WO2017155466A1 - Method and system for visitor tracking at a POS area - Google Patents

Method and system for visitor tracking at a POS area

Info

Publication number
WO2017155466A1
Authority
WO
WIPO (PCT)
Prior art keywords
visitor
data
area
master user
image
Prior art date
Application number
PCT/SG2017/050111
Other languages
English (en)
French (fr)
Inventor
Lin Xiaoming
Ravichandran PRASHANTH
Original Assignee
Trakomatic Pte. Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Trakomatic Pte. Ltd. filed Critical Trakomatic Pte. Ltd.
Priority to AU2017231602A priority Critical patent/AU2017231602A1/en
Priority to CN201780018258.6A priority patent/CN109074498A/zh
Publication of WO2017155466A1 publication Critical patent/WO2017155466A1/en
Priority to PH12018501919A priority patent/PH12018501919A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/50Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions

Definitions

  • the invention relates to a method and a system for visitor tracking at a POS area.
  • the invention also relates to a method for running a master user database on an online and offline shopping server.
  • CCSC South Central Conference
  • This method is based on evaluating well-normalized local histograms of image gradient orientations in a dense grid. Local object appearance and shape can often be characterized rather well by the distribution of local intensity gradients or edge directions, even without precise knowledge of the corresponding gradient or edge positions. This is implemented by dividing the image window into small spatial regions, so-called cells, and accumulating for each cell a local 1-D histogram of gradient directions or edge orientations over the pixels of the cell. The combined histogram entries form the representation. Tiling the detection window with a dense (in fact, an overlapping) grid of HOG descriptors results in a human detection chain.
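The following minimal sketch (not part of the patent) illustrates the idea with OpenCV's built-in HOG descriptor and its stock people detector; window size, stride and the input file name are illustrative assumptions.

```python
import cv2

# Dense HOG features plus the classic HOG+SVM pedestrian detection chain.
hog = cv2.HOGDescriptor()  # default 64x128 detection window, 9 orientation bins
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

img = cv2.imread("frame.jpg")                    # hypothetical camera frame

# Raw descriptor: per-cell gradient-orientation histograms, block-normalized
# and concatenated into one feature vector.
features = hog.compute(cv2.resize(img, (64, 128)))

# Sliding-window detection over the whole frame ("human detection chain").
rects, weights = hog.detectMultiScale(img, winStride=(8, 8), scale=1.05)
for (x, y, w, h) in rects:
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
```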
  • Each pixel of each cell is then compared to each of its neighboring pixels, clockwise or counter-clockwise.
  • The differences in pixel value result in numbers which are combined in a histogram showing how frequently each number occurs.
  • the histograms of all cells are concatenated, which gives the feature vector of the image.
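A minimal LBP sketch follows, assuming scikit-image is available; grid size and neighborhood parameters are illustrative choices, not taken from the patent.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_feature_vector(gray, grid=(8, 8), P=8, R=1):
    """Concatenate per-cell histograms of LBP codes into one feature vector."""
    # Each pixel is compared with its P circular neighbors at radius R; the
    # comparisons form a binary code per pixel ("uniform" mapping used here).
    codes = local_binary_pattern(gray, P, R, method="uniform")
    n_bins = P + 2                                  # uniform patterns + "other"
    gh, gw = grid
    ch, cw = gray.shape[0] // gh, gray.shape[1] // gw
    hists = []
    for i in range(gh):
        for j in range(gw):
            cell = codes[i * ch:(i + 1) * ch, j * cw:(j + 1) * cw]
            hist, _ = np.histogram(cell, bins=n_bins, range=(0, n_bins))
            hists.append(hist / max(hist.sum(), 1))  # normalize each cell
    return np.concatenate(hists)
```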
  • Gray-level co-occurrence matrices (GLCM) are another approach for texture classification in computer vision (http://www.fp.ucalgary.ca/mhallbey/tutorial.htm).
  • a GLCM considers the relation between two pixels at a time.
  • the GLCM represents the spatial relationship between groups of two (usually neighboring) pixels, called the reference and the neighbor pixel. The neighbor pixel is often the pixel at a distance of one pixel from the reference pixel in a predetermined direction, but the distance between these pixels can also be any number other than one.
  • the matrix is a tabulation of how often different combinations of pixel brightness values, which are grey levels in the case of a grey level image, occur in the image.
  • textures can be measured and classified.
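As a hedged illustration of the GLCM approach (scikit-image's graycomatrix is assumed; older versions spell it greycomatrix, and the image here is a random stand-in):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

gray = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in image

# Tabulate how often a reference pixel with grey level i has a neighbor at
# distance 1 in direction 0 degrees with grey level j.
glcm = graycomatrix(gray, distances=[1], angles=[0],
                    levels=256, symmetric=True, normed=True)

# Texture measures derived from the matrix, usable for classification.
contrast = graycoprops(glcm, "contrast")[0, 0]
homogeneity = graycoprops(glcm, "homogeneity")[0, 0]
```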
  • Costa A. F., Humpire-Mamani G., Traina A. J. M.: An Efficient Algorithm for Fractal Analysis of Textures, published in Graphics, Patterns and Images (SIBGRAPI), 2012 25th SIBGRAPI Conference, 22-25 Aug. 2012.
  • SFTA Segmentation-based Fractal Texture Analysis
  • solvePnP is a function of the OpenCV API (http://docs.opencv.org/2.4/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html).
  • The function estimates an object pose given a set of object points, their corresponding image projections, as well as a camera matrix and distortion coefficients.
  • First, object points (3D), image points (2D) and the camera matrix are input into the solvePnP function.
  • solvePnP returns a rotation vector (rvec) and a translation vector (tvec) with which points from the model coordinate system can be mapped to the camera coordinate system.
  • the camera coordinate system is the camera's Cartesian coordinate system; it moves with the camera, and the camera is always at the origin.
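A minimal solvePnP sketch following the description above; all point values and intrinsics are illustrative assumptions, not data from the patent.

```python
import cv2
import numpy as np

# 3D object points in the model coordinate system and their 2D projections.
object_points = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0],
                          [0, 1, 0], [0, 0, 1], [1, 0, 1]], dtype=np.float64)
image_points = np.array([[320, 240], [400, 238], [402, 310],
                         [322, 312], [318, 170], [398, 168]], dtype=np.float64)
camera_matrix = np.array([[800, 0, 320],
                          [0, 800, 240],
                          [0, 0, 1]], dtype=np.float64)
dist_coeffs = np.zeros(5)                    # assume negligible distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)

# Map a model-coordinate point into the camera coordinate system:
R, _ = cv2.Rodrigues(rvec)                   # rotation vector -> 3x3 matrix
p_cam = R @ object_points[0].reshape(3, 1) + tvec
```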
  • US 6,711,293 B1 discloses a method and an apparatus for identifying scale invariant features in an image and use of the same for locating an object in the image, the method being called scale invariant feature transform (SIFT).
  • the method is performed by locating pixel amplitude extrema in a plurality of difference images which are produced from the initial image. First, local maximal and minimal amplitude pixels in each difference image are identified. Then possible maximal and minimal amplitude pixels are identified, followed by identifying the actual maximal and minimal amplitude pixels.
  • a plurality of component subregion descriptors for each subregion of a pixel region about the pixel amplitude extrema in the plurality of difference images is produced.
  • interesting points on the object can be extracted from the plurality of component subregion descriptors to provide a so-called "feature description" of the object. This feature description can then be used to identify the object when attempting to locate the object in another image.
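A hedged sketch of using SIFT descriptors to locate an object in another image, as described above; cv2.SIFT_create is available in opencv-python 4.4+ and the file names are placeholders.

```python
import cv2

img_obj = cv2.imread("object.jpg", cv2.IMREAD_GRAYSCALE)    # object to find
img_scene = cv2.imread("scene.jpg", cv2.IMREAD_GRAYSCALE)   # search image

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img_obj, None)    # keypoints at DoG extrema
kp2, des2 = sift.detectAndCompute(img_scene, None)  # plus local descriptors

# Match feature descriptions with Lowe's ratio test to identify the object.
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]
```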
  • EP 1 850 270 B1 describes a method for determining an interest point in an image having a plurality of pixels, suitable for working at different scales and/or rotations.
  • the method produces a local feature detector and descriptor called Speeded Up Robust Features (SURF).
  • The method comprises filtering an image using at least three digital filters and then selecting an interest point based on determining a measure resulting from application of the digital filters.
  • the measure is a non-linear combination of the outputs of the digital filters and captures variations of an image parameter in more than one dimension or direction.
  • SURF's feature descriptor is based on the sum of the Haar wavelet responses around the point of interest. SURF descriptors can be used for object recognition, localization, tracking, registration and classification, 3D reconstruction and extracting points of interest in an image.
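A corresponding SURF sketch; note that in OpenCV the SURF implementation lives in the contrib module and requires a build with the non-free algorithms enabled, so this is only a sketch of the API.

```python
import cv2

img = cv2.imread("scene.jpg", cv2.IMREAD_GRAYSCALE)      # placeholder image

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
keypoints, descriptors = surf.detectAndCompute(img, None)

# Each descriptor summarizes Haar wavelet responses around its interest point:
# 64 floats by default, 128 after surf.setExtended(True).
```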
  • M. Turk and A. Pentland: Eigenfaces for Recognition, Journal of Cognitive Neuroscience, vol. 3, no. 1, pages 71-86, 1991 discloses a method for recognizing human faces with computer systems.
  • a collection of face images is transformed by a mathematical process called Principal Component Analysis (PCA) into a collection of face images with a low-dimensional representation.
  • PCA Principal Component Analysis
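A minimal eigenfaces sketch in the spirit of Turk and Pentland; the face matrix here is random stand-in data and the number of components is an arbitrary choice.

```python
import numpy as np

faces = np.random.rand(100, 64 * 64)     # (n_images, height*width), flattened
mean_face = faces.mean(axis=0)
centered = faces - mean_face

# Principal components of the face collection ("eigenfaces") via thin SVD.
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = Vt[:50]                      # keep the top 50 components

# Low-dimensional representation: each face becomes 50 projection weights,
# and recognition compares these weight vectors, e.g. by Euclidean distance.
weights = centered @ eigenfaces.T
```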
  • TLD tracking-learning-detection
  • This classifier bootstraps itself by using a current tracker state to extract positive and negative examples from the current frame. Slight inaccuracies in the tracker can therefore lead to incorrectly labeled training examples, which degrades the classifier and can cause further drift. Using Multiple Instance Learning (MIL) instead of traditional supervised learning avoids these problems and can therefore lead to a more robust tracker with fewer parameter tweaks. Further, a novel online MIL algorithm for object tracking is disclosed.
  • MIL Multiple Instance Learning
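OpenCV ships an online MIL tracker; the following hedged sketch shows how such a tracker is driven frame by frame (the video path and interactive ROI selection are placeholders).

```python
import cv2

cap = cv2.VideoCapture("visitors.mp4")     # hypothetical video source
ok, frame = cap.read()
bbox = cv2.selectROI("init", frame)        # initial box around the person

tracker = cv2.TrackerMIL_create()
tracker.init(frame, bbox)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # The MIL tracker updates its appearance model from bags of positive and
    # negative patches around the current estimate, tolerating label noise.
    found, bbox = tracker.update(frame)
    if found:
        x, y, w, h = map(int, bbox)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
```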
  • An object of the present invention is to provide a method and a system for automatically tracking a visitor in a POS area, wherein the quality of services based on data obtained by tracking a visitor in the POS area is improved.
  • a further object of the present invention is to improve the data content of a master user database of an online shopping system.
  • a first aspect of the present invention relates to a method for visitor tracking at a POS area comprising the steps of
  • biometric face templates of a master user database are each assigned to a user account of an online shopping system and the master user database comprises online shopping behavior data for each user. If a match between the biometric face template of the visitor and one of the biometric face templates of the master user database is found, then the content of the master user database is used for supporting a service at the POS area.
  • This visitor tracking method detects visitors of the POS area by means of the face, in that a biometric face template is extracted from an image showing the face of the corresponding visitor.
  • This biometric face template is used to access data of a user account of this visitor in an online shopping system. These data comprise online shopping behavior data. At the POS area, services can be provided that are based on such online shopping behavior data.
  • the online shopping behavior can comprise a browsing behavior and/or a shopping behavior.
  • the browsing behavior comprises a list or table of URLs of websites visited by the user of the online shopping system, which can be combined with time stamps or statistical data, particularly how often and/or how long the respective websites were visited by the user.
  • the shopping behavior comprises data of the purchased goods and/or services. These shopping behavior data can further comprise timestamps of each purchase, information on the value of each purchase and information about the payment method.
  • These services can comprise services to the visitor or customer, respectively, services assisting the sales staff of the POS area and services to the management of a retail shop using this POS area.
  • An example of the service to a visitor is to automatically send a message to a mobile device of the visitor, containing information about the retail shop using the POS area, special offers or any other product information.
  • the online shopping behavior data are considered in generating this message.
  • The visitor will receive information that can be individually adapted to their preferences and wishes.
  • Furthermore, terminals can be provided in the POS area for the visitors, at which a visitor can look up information about the retail shop using this POS area.
  • the information provided at such a terminal can be personalized to the user on the basis of their online shopping behavior data retrieved from the master user database.
  • the sales staff of the retail business at the POS area can be automatically informed about the shopping behavior of the visitor of the POS area, who is a potential customer of the POS area.
  • the sales staff can provide individually adapted services and offers to the visitors of the POS area.
  • the amount of data received from the master user database can be limited according to the respective law governing data protection and data security.
  • the management of the shop does not need any personal data of the visitors of the POS area for the statistical analysis.
  • Anonymous data which do not contain a name, detailed address or telephone number are mostly sufficient.
  • visitor data extracted from images captured in the POS area can be stored in a POS database which is independent from the master user database.
  • the POS database preferably is a local database connected to a visitor tracking system for tracking the visitors in the POS area.
  • This POS database describes the offline shopping behavior. This offline shopping behavior can also be used for supporting the above-mentioned services as long as the POS database contains a sufficient basis of data.
  • One of the services provided to the visitors is generating and transmitting a message to a mobile device of the visitor.
  • The contact data of the mobile device can be retrieved from the master user database.
  • Preferably, it is checked at the POS area whether the mobile device is present in the POS area. This can be carried out by checking communication data exchanged between a transceiver in the POS area and mobile devices located in the POS area.
  • a visitor of the POS area is tracked by capturing one or more images of the visitor by one or more cameras and offline shopping behavior data are preferably extracted from those images.
  • Offline shopping behavior data are e.g. based on the time a visitor spends in front of certain products, advertisements and displays.
  • Furthermore, offline shopping behavior data can be based on data received from a cash register at the POS area when the visitor is purchasing certain goods or services.
  • These data can comprise an identification and an amount of goods and services purchased by the visitor.
  • the POS area preferably comprises several types of stationary cameras.
  • Such a camera can be a facial detection camera that is arranged so that its field of view is directed towards the face of a visitor in an upright position.
  • Tracking cameras can be provided which track the visitors of the POS area with a field of view from above, so that a plurality of visitors can be tracked simultaneously by one camera.
  • the present invention provides a method wherein a master user database is run on a shopping server, which is part of an online shopping system.
  • a user may log in to the shopping system by means of a client computer comprising a client camera.
  • An image of the user is captured by the client camera and a biometric face template is extracted from this image, wherein the user's image and/or the user's biometric face template is stored in the master user database.
  • This method provides a master user database containing biometric face templates of each user.
  • The biometric face template can be used as an identification for the users, so that the content of the master user database, particularly the content of accounts of specific users, can easily be connected or related to other data on another system which relate to the same person, by means of the biometric face template.
  • This method is particularly advantageous in combination with the above-described first aspect of the invention, because all data retrieved from the tracking at the POS area of a certain visitor can easily be linked to the account data in the master user database of the same person by means of the biometric face templates.
  • offline shopping behavior data retrieved by tracking a visitor in a POS area can be added to a master user database if a biometric face template of the visitor matches one of the biometric face templates of the master user database, which means that the visitor is the same person as the user of the account of the master user database comprising the matching biometric face template.
  • the information content or information value of the master user database is increased significantly, because data of the real world are automatically detected and added to the master user database.
  • the master user database comprises a very broad basis of data which is highly valuable for statistical analysis.
  • each time the user is logging into the online shopping system an image of the user is captured and a biometric face template can be extracted and stored in the master user database.
  • This storing of the biometric face template does not necessarily mean that an already present face template is completely replaced by a new biometric face template; the storing can also be carried out in that an already existing biometric face template is updated by the data of the newly delivered biometric face template.
  • the biometric face template that is generated during a log-in procedure can be used as a log-in criterion for the online shopping system.
  • each time the user logs into the online shopping system at least one image of the user is captured.
  • Such a master user database comprising combined online and offline shopping behavior data can be used for preparing shopping user profiles, data for supporting advertisements or for generating further marketing characteristics.
  • the invention refers to a system for visitor tracking at a POS area which is embodied for carrying out one or more of the methods described above.
  • Such a system preferably comprises at least one facial detection camera, one or more tracking cameras and a central control unit connected to the cameras.
  • the tracking system preferably comprises a storage means hosting a local database, which is called POS database.
  • the central control unit is preferably connected to a mobile shopping system comprising a remote storage means hosting a master user database.
  • Neighboring cameras of the tracking system are preferably arranged so that the fields of view are overlapping to such an extent that at least one person can be simultaneously detected by both neighboring cameras.
  • the tracking system is preferably connected by means of a data line to a cash register of a POS area, so that data describing payment transactions are available to the tracking system. The sales transactions can be synchronized to the tracking system by using timestamp synchronization.
  • Figure 1 shows schematically a tracking system at a POS area,
  • Figure 2 shows a tracking camera with its field of view,
  • Figures 8 and 9 each show schematically a further embodiment of a tracking system and its relationship to an online shopping system,
  • Figures 10 and 11 each show the data structure of a data set of a POS database and a master user database.
  • Figure 1 shows an embodiment of a tracking system 1 for visitor tracking at a POS area 2.
  • the POS area 2 comprises a point of sale (POS) 3 at which a visitor or customer, respectively, of the POS area 2 can finalize a purchase process.
  • the point of sale 3 comprises a cash register 4.
  • a point of sale is not limited to such a cash register but is any physical point at which any kind of purchase, buying, hiring or renting process between a party which offers a certain good and/or service and a customer can be finalized.
  • the finalizing of a purchase process can also be carried out by means of a mobile device instead of a fixed cash register.
  • the POS area is any physical area in which goods and/or services are presented to initiate a purchase process.
  • the POS area 2 comprises one or more displays or advertisements for displaying the offered goods and/or services.
  • a visitor 5 of the POS area 2 can walk along the displays and make up their mind whether they want to buy or rent the offered products or services.
  • The POS area 2 is usually part of a physical real-world shop.
  • the POS area 2 can also be embodied as a commercial or non-commercial fair for offering goods and services, such as a business fair or a flea market. Any area comprising displays or advertisements belongs to the POS area, such as external areas of a shop building comprising advertisement windows or other displays.
  • the tracking system 1 comprises several tracking cameras 6 which are arranged along the walking paths in the POS area 2.
  • the tracking cameras 6 are embodied as overhead cameras having their direction of view directed downward, so that these tracking cameras 6 image the visitors 5 with a viewing direction from above.
  • Each tracking camera 6 is a digital camera for generating electronically readable image files.
  • Each image file comprises at least one image taken of a certain field of view 7 which is defined by a lens of the tracking camera 6.
  • the image file can also comprise a video stream which is a continuous sequence of images.
  • the tracking cameras 6 are arranged so closely to each other that the fields of view 7 of neighboring cameras are overlapping.
  • the tracking system 1 comprises at least one facial detection camera 8, which is a digital camera arranged with its direction of view in the POS area 2 so that an image of the face of a visitor 5 having their head in an upright position can be taken.
  • a facial detection camera 8 is located in an entrance area 9 of the POS area 2 and a second facial detection camera 8 is located at the point of sale 3.
  • the facial detection camera 8 at the entrance area is directed with its direction of view from the inside of the POS area to the entrance area, so that the faces of the visitors entering the POS area 2 are captured. It can also be useful to have a facial detection camera 8 directed with its direction of view from the outside of the POS area to the entrance area, so that the faces of the visitors leaving the POS area 2 are captured. With this camera 8 all visitors are captured, independently of whether they bought an article or just passed through the POS area without buying anything.
  • a path starting with the entrance area 9 and ending at the point of sale 3 is continuously monitored by the cameras 6, 8. It is also possible to provide cameras for monitoring areas in front of an external advertisement window or any other external display.
  • the facial detection cameras 8 each define a field of view 10. The first facial detection camera 8/1 and a first tracking camera 6/1 located in the entrance area 9 are arranged so that the corresponding fields of view 7, 10 overlap. A visitor 5 can thus be simultaneously detected by the first tracking camera 6/1 and the first facial detection camera 8/1.
  • Each camera 6, 8 is connected to a data line 11 of a local area network.
  • the system also comprises a central control unit 12 and a storage means 13 for hosting a local database 14, which is in the following called "POS database" 14.
  • the central control unit 12 is connected to a wide area network (WAN) such as the internet 15 or an intranet.
  • WAN wide area network
  • the central control unit 12 has a data connection via the WAN 15 to a remote storage means 16 hosting a master user database 17.
  • the master user database 17 is part of an online shopping system (not shown in the figures).
  • the online shopping system comprises a web page which can be displayed on a client computer for offering the user of the client computer certain goods and/or services.
  • the client computer can be a desktop computer or any other mobile or stationary device having an interface to be connected to the WAN and display and input means for displaying the web page of the online shopping system and entering corresponding orders.
  • S uch client computers are typically mobile phones, tablets, handheld computers, etc.
  • the online shopping system is embodied in such a way that a user has to register themselves for using the online shopping system.
  • registration data of the user are stored in the master user database 17.
  • the registration data are allocated in corresponding user accounts and comprise the full name, email address, mailing address and phone number of the user.
  • the registration data also comprise a biometric face template.
  • the signing up can be carried out by means of a user name and a password. Such a signing-up procedure is provided for the case that either the user is using a client computer without a camera or someone else authorized by the user is using the user's online shopping account.
  • the biometric face template is extracted from an image of the face of the user provided to the online shopping system.
  • a social media account can also be used for signing up to the online shopping system, which makes the images of the user's face available to the online shopping system.
  • An image of a user's face can also be provided in a different way than by means of a social media account.
  • the online shopping system can, for example, require an image of the user's face as a necessary part of the registration data which have to be supplied by the user at the first registration for the online shopping system.
  • the client computer comprises a camera having a direction of view directed at the user of the client computer and a software application for capturing an image of the user each time the user signs up to the online shopping system. This image can then be used for signing up to the online shopping system.
  • a biometric face template is extracted by the software application of the client computer and this client biometric face template is compared to the biometric face templates of the master user database for authorizing the user of the client computer to sign up to the mobile shopping system. Only if there is a match between the client's biometric face template and the corresponding face template of the master user database is the user allowed to virtually enter the online shopping system.
  • This login procedure can be combined with an additional security means such as a password.
  • using the biometric face template as a key for authorizing a user to virtually enter the shopping system provides, on the one hand, high security for the online shopping system, in that only a registered person can use the corresponding account, and, on the other hand, easy access to the online shopping system for the user.
  • The biometric face template can also be combined with further data for signing up to the online shopping system, such as a password.
  • a further advantage of signing up to the online shopping system by means of a biometric face template is that each time the user signs up to the online shopping system a current image of the user and a biometric face template are generated and at least either the image or the biometric face template is forwarded to the online shopping system. Therefore, the biometric face template included in the registration data of the master user database can be updated each time the user signs up to the online shopping system. This can help improve the accuracy over an extended period of time in which there are physical (biological) changes in the person's appearance, such as natural aging, make-up and changing hairstyles.
  • images of a social media account sometimes provide further valuable information about the user. E.g., the places or geo-locations where the images were taken are often linked to the corresponding images. In this way the travel behavior of the users can be derived and used, e.g., for offering certain travel services.
  • a biometric face template is extracted from the user's image on the client computer and forwarded to the master user database.
  • the client computer forwards the user's image to the online shopping system, where the biometric face template of the image is extracted and compared to the biometric face templates included in the master user database.
  • the image as well as the biometric face templates of the user can be stored in the corresponding registration data of the master user database.
  • the tracking system 1 comprises a face detection module 19 which is designed for detecting one or more faces in an image captured by one of the facial detection cameras 8.
  • the face detection module 19 receives one or more images or frames, respectively, from the facial detection camera 8 (step S1, figure 5). Each frame is analyzed by means of the Haar cascade Viola-Jones method to determine whether it contains one or more faces (step S2).
  • For each face a rectangular section of the image is cropped as face thumbnail 18 (step S3).
  • From each face thumbnail face features are extracted (step S4).
  • the eyes are detected in each face thumbnail by means of the Haar cascade Viola-Jones method.
  • the face thumbnail is normalized.
  • From the normalized face thumbnail the face features are extracted.
  • The extraction of the face features can be carried out by histogram of gradients (HOG), local binary patterns (LBP), speeded-up robust features (SURF), scale-invariant feature transform (SIFT) and/or by Eigen-features. These methods for extracting face features are known and can be used separately or in combination.
  • The extracted face features are combined into a vector; this vector forms the biometric face template (step S5).
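The following minimal sketch illustrates steps S1 to S5 under stated assumptions: OpenCV's stock Haar cascades stand in for the Viola-Jones detectors, the thumbnail is normalized to a fixed size with histogram equalization, and HOG stands in for the feature extractor; sizes and thresholds are illustrative.

```python
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

frame = cv2.imread("frame.jpg")                           # S1: receive frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

templates = []
for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):  # S2: faces
    thumb = gray[y:y + h, x:x + w]                        # S3: face thumbnail
    eyes = eye_cascade.detectMultiScale(thumb)            # S4: locate eyes,
    thumb = cv2.equalizeHist(cv2.resize(thumb, (64, 64))) # normalize thumbnail
    hog = cv2.HOGDescriptor((64, 64), (16, 16), (8, 8), (8, 8), 9)
    features = hog.compute(thumb).flatten()               # extract features
    templates.append(features)                            # S5: template vector
```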
  • This detection module is used for generating the biometric face templates of visitors captured with one of the facial detection cameras 8.
  • the face detection module can also be embodied for generating the biometric face templates of visitors captured with one of the facial detection cameras 8 by means of a method as described in the Singapore Patent Application No. 10201504080W (date of filing: 25 May 2015).
  • This Singapore Patent Application No. 10201504080W is incorporated by reference. According to this method an image is captured which contains a face and which is further processed by
  • Non-facial attributes are particularly used for tracking users on the same day and/or in the same vicinity and/or the same POS area. If the face detection module 19 of the tracking system 1 detects a face, a corresponding visitor data set is generated and stored in the POS database 14. The visitor data set comprises, at least for the time being, all information which is generated and collected by the tracking system 1 about the corresponding visitor, such as one or more images of this visitor, the biometric face template etc.
  • Such a face detection module can also be provided in the online shopping system described above or in the corresponding software application at the client computer for extracting the biometric face templates of images taken of the user by means of the camera of one of the client computers.
  • the tracking system 1 comprises a human detection module 20 for detecting humans in the images captured by one of the tracking cameras 6.
  • the detection of the humans is carried out by histogram of gradients (HOG) and/or by means of Haar features.
  • the human detection module is trained in a training phase before using it to track people in the POS area 2.
  • All cameras 6, 8 are calibrated with respect to each other. Such a calibration can be carried out by means of a chessboard grid which is placed, during the calibration phase, in the POS area 2 so that the chessboard grid is visible to at least two neighboring cameras simultaneously.
  • the calibration can be carried out with software modules such as cv2.findChessboardCorners, which yield camera matrices and distortion coefficients.
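A hedged calibration sketch: cv2.findChessboardCorners locates the grid in several views and cv2.calibrateCamera then yields the camera matrix and distortion coefficients (board size and image paths are assumptions).

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)                                   # inner corners of the board
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("calib/*.jpg"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:                                      # board seen in this view
        obj_points.append(objp)
        img_points.append(corners)

ret, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
```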
  • the tracking system 1 comprises a first tracking module 21 which is designed for passing person information from one of the facial detection cameras 8 to the neighboring tracking camera 6.
  • the first tracking module 21 receives from the face detection module 19 person information of a person whose face was detected by the face detection module 19 (step S6, figure 6).
  • This person information can comprise the biometric face template and/or an image captured of the person by one of the facial detection cameras 8.
  • This person information can also comprise pre-processed coordinate information relating to the person shown in the image captured by the facial detection camera 8.
  • the first tracking module 21 transforms the person information generated by the facial detection camera 8 into three-dimensional coordinates of the person in a world coordinate system. Specific points of a person are determined in the coordinates of a coordinate system of the facial detection camera 8 (step S7). These person coordinates are then transformed to a world coordinate system by rotating and translating the set of person coordinates by means of solvePnP.
  • the first tracking module 21 also receives an image captured by one of the tracking cameras 6 that is located next to the facial detection camera 8 and has an overlapping field of view (step S8).
  • the image captured by the tracking camera 6 is taken simultaneously with the one captured by the facial detection camera 8.
  • the image of the tracking camera 6 is analyzed by means of the human detection module 20 to extract further person information, which comprises two-dimensional coordinates.
  • the two-dimensional coordinates are transformed into three-dimensional coordinates (step S9).
  • the three-dimensional coordinates of the person captured with the tracking camera 6 are then transformed into world coordinates by rotating and translating the set of coordinates (step S10), e.g. by means of solvePnP.
  • in step S11 the world coordinates of the person captured by the facial detection camera 8 and of the person captured by the tracking camera 6 are compared. If there is a three-dimensional spatial overlap with respect to a certain threshold, then it is determined that the coordinates captured by the two cameras 6, 8 relate to the same person.
  • if not, the program flow returns to step S9 for extracting coordinates of a further person shown in the image.
  • If it is determined in step S11 that the coordinates extracted from the images of the two cameras 6, 8 overlap and thus relate to the same person, the data extracted from the images of the two cameras 6, 8 are concatenated by storing them in the same visitor data set (step S12).
  • the method described above requires that both cameras 6, 8 have an overlapping field of view 7, 10.
  • the size of the overlap determines the area where positions of a detected visitor can be compared.
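A minimal sketch of the cross-camera association in steps S9 to S12: detections from both cameras are brought into one world coordinate system and treated as the same person when they lie closer than a threshold. R and t per camera would come from solvePnP as described above; the threshold value is an assumption.

```python
import numpy as np

def to_world(p_cam, R, t):
    """Invert cam = R @ world + t to map a camera-space point to world space."""
    return R.T @ (p_cam - t)

def same_person(p_face_cam, p_track_cam, calib_face, calib_track, thresh=0.5):
    w_face = to_world(p_face_cam, *calib_face)     # facial detection camera 8
    w_track = to_world(p_track_cam, *calib_track)  # tracking camera 6
    # Step S11: spatial overlap within a certain threshold (here 0.5 m).
    return np.linalg.norm(w_face - w_track) < thresh

# If same_person(...) is True, both detections are stored in the same visitor
# data set (step S12); otherwise the next person in the image is examined (S9).
```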
  • a second tracking module 22 is provided in the tracking system 1 for generating a walking trail by means of images captured by one of the tracking cameras 6.
  • the second tracking module 22 receives an image from the tracking camera 6 (step S13, figure 7).
  • a rectangular representation 23 (figure 2) of a person shown in the image is detected by histogram of gradients (HOG) and/or Haar-like features (step S14).
  • a Person Model is trained on the basis of the rectangular representation 23.
  • The Person Model represents the specific person shown in the image and can include a combination of shape (HOG, Haar etc.), points of interest (SIFT, SURF etc.) and texture (GLCM: grey level co-occurrence matrix, SFTA: segmentation-based fractal texture analysis) in various color spaces (HSV, RGB etc.).
  • a further image is received from that tracking camera 6 (step S15).
  • in step S16 a further rectangular representation 24 is extracted in a similar way as in step S14.
  • An overlap of the rectangular representations 23, 24 is determined (step S17). On the basis of this overlap it can be determined how far the person has moved between the moments when the first image and the subsequent image were captured.
  • If in step S16 no further rectangular representation 24 can be determined, then the Person Model is used for detecting the person in the image (step S18). This can be the case if the person has changed their position in such a way, e.g. by bending forward, that no further rectangular representation can be determined.
  • the Person Model is more generic, so that coordinates of the person can be determined even if the position of the person has substantially changed.
  • in step S19 the coordinates of the person in the different images are added to the visitor data set together with the corresponding time stamps. These data represent a walking trail of the person in the POS area 2.
  • If in steps S16 to S18 further information about the person is extracted from the images, it is possible to improve the Person Model on the basis of this information.
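The overlap test of step S17 can be expressed, for example, as intersection-over-union of the two rectangles; the sketch below is an illustration, not the patent's exact criterion.

```python
def rect_overlap(a, b):
    """Intersection-over-union of two (x, y, width, height) rectangles."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

# A large overlap between representations 23 and 24 means the person moved only
# slightly between frames; the displacement contributes to the walking trail.
```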
  • TLD tracking-learning-detection
  • online MIL online multiple instance learning
  • mean shift
  • the tracking system 1 comprises a third tracking module 25 for joining walking trails generated by two tracking cameras 6.
  • the walking trails of two tracking cameras 6 can be joined if the fields of view 7 of the two cameras 6 overlap (figure 3, 4).
  • a visitor 5 located in the overlapping range of the fields of view 7 is simultaneously tracked by both cameras 6. This means that both tracking cameras 6 simultaneously capture an image in which the visitor 5 is shown.
  • the world coordinates of the visitor are determined in the same way as described above with the steps S8 to S10.
  • the coordinates of the visitor are compared and if they overlap within a certain threshold then it is determined that the images taken simultaneously by the two different tracking cameras 6 show the same person.
  • The coordinates and time stamps of the walking trails generated by each tracking camera 6 are then stored in the same visitor data set so that the walking trails of the different tracking cameras 6 are connected.
  • Thus it is possible to connect walking trails of several pairs of neighboring tracking cameras 6 by means of the third tracking module 25, so that a visitor can be tracked throughout the complete area which is covered by the tracking cameras 6.
  • the control unit 12 comprises a database administration module 26 which provides access to the P OS database 14 as well as to the master user database 17.
  • VTM vector to match
  • the VTM is compared to corresponding vectors of user data sets in the master user database.
  • The data set comprising a vector that is closest to the VTM is determined by means of a match score.
  • The best (typically highest) match score represents the closest match.
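A hedged sketch of the match-score lookup: the VTM is compared against the template vectors of all user data sets, with cosine similarity as one plausible score (the patent does not prescribe a particular metric, and the acceptance threshold is an assumption).

```python
import numpy as np

def best_match(vtm, templates, min_score=0.6):
    """Return the user ID whose template is closest to the VTM, or None."""
    vtm = vtm / np.linalg.norm(vtm)
    scores = [(uid, float(t @ vtm / np.linalg.norm(t)))
              for uid, t in templates.items()]
    uid, score = max(scores, key=lambda s: s[1])   # highest score = closest
    return uid if score >= min_score else None     # None: no matching account
```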
  • the visitor data set generated by the system 1 and describing the shopping behavior of a visitor 5 of the POS area 2 is transmitted to the master user database 17 and stored in the account of the corresponding user.
  • the data of the account of the online shopping system is supplemented by offline shopping behavior data based on the physical activities of the visitor 5 in the real world, namely the P OS area 2.
  • all information generated in the tracking system 1 and relating to a certain visitor 5 are stored in the account of the online shopping system of the visitor instead of storing it in the local POS database 14.
  • the data set generated in the system 1 for describing the behavior of the corresponding visitor 5 in the P OS area 2 is stored in the local POS database 14.
  • the POS database 14 is also called "anonymous database", because it describes visitors 5 of the POS area 2 without a name, address or any contact data of the visitors.
  • the data set of the POS database 14 comprises the biometric face template of the visitor and/or one or more images of the visitor, so that the biometric face template of a newly detected visitor 5 can also be compared with the data sets of the POS database 14 if no matching account is available in the master user database 17.
  • The provision of a biometric face template and/or one or more images showing the face of the visitor makes it possible to check the POS database 14, each time a new user registers for a new account in the master user database 17, for a data set describing the real-life shopping behavior of the new user of the online shopping system. If such a data set exists, it can be transferred to the master user database 17 and stored together with the account information of the new user.
  • the system 1 comprises a service module 27 which provides several service functions to the visitors 5 of the POS area 2, to sales staff of the POS area 2 and to the management of the POS area 2.
  • the service module 27 automatically creates messages to a mobile device used by a visitor 5 which is present in the POS area 2.
  • the visitor 5 is identified by the face detection module 19 and if a corresponding data set in the master user database 17 is available, contact data comprised in this data set are used by the service module 27 to send a message to the visitor's mobile device.
  • the message can contain e.g. information about special offers.
  • The content of the message can be generated automatically by considering the visitor's buying behavior, which is stored in the corresponding data set or account in the master user database 17 and which is based on the former activities of the visitor in the online shopping system and/or their shopping behavior in a POS area also monitored by the system 1 according to the present invention.
  • communication data can be based on WIFI communication, Bluetooth communication or any other standard local area communication between the mobile device and corresponding transceivers in the POS area 2 which are connected to the system 1.
  • the online shopping system can also comprise functions for a loyalty program for customers of retail businesses.
  • the automatic identification of the visitor 5 in the POS area 2 can also be used for such a loyalty program for customers of retail businesses, with which shoppers can collect points for purchases and redeem them for vouchers, goods or money.
  • The user is automatically identified by capturing their face, and all goods and services purchased by the visitor are automatically registered by the cash register 4, which is connected to the central control unit 12 by means of the data line 11.
  • customer retention can thus easily be improved without a need for the customer to perform any administrative activities.
  • the sales staff of the retail business at the POS area 2 can automatically be informed by the service module 27 about the shopping behavior of the visitor present at the POS area 2.
  • the sales staff can provide individually adapted services and offers to the visitors 5 of the POS area 2.
  • The sales staff preferably use mobile devices to which information about the shopping behavior of the visitor is transmitted automatically by the service module 27.
  • This shopping behavior can be displayed in a diagram showing values for typical characteristics of the visitor, such as interests in certain kinds of goods, credit rating of their credit cards, etc.
  • the service module 27 can generate a statistical analysis of all visitors visiting the POS area 2, wherein the online shopping behavior, which is registered in the master user database 17, can also be considered.
  • Each visitor can be classified very precisely, so that the information content of such a statistical analysis is very significant and useful for the management of the retail business using this POS area 2.
  • Such statistical analyses are very valuable for assessing the value of a POS area. With these statistical analyses not only the footfall of the POS area 2 can be detected but also the type of customer can be classified. It is also possible to analyze the efficiency of advertisements and displays in the POS area 2 and their effect on the customers.
  • a significant improvement in managing and controlling a retail business is achieved, because not only data of the real-world retail business (offline buying behavior) but also data of the online buying behavior are considered. The basis for the data is enlarged significantly when the data of the real shopping world and the data of the virtual shopping world are automatically connected.
  • the above information relating to the image of the face of a user, such as the biometric face template, is used for registering in the online shopping system and is also used for identifying a visitor 5 in the POS area 2.
  • the data sets of the visitors 5 are only stored in the POS database 14 if no corresponding matching account in the master user database 17 is available.
  • Figures 10 and 11 show examples of the data structure of the POS database 14 and the master user database 17.
  • the POS database 14 comprises data sets 29 for visitors who enter the POS area 2.
  • Each data set of a certain visitor comprises at least a face template generated by means of the face detection module 19.
  • the face template is used as a biometric identifier.
  • the data set 29 can comprise one or more face templates.
  • a single face template can also be based on several images which are captured on subsequent visits to the POS area 2.
  • the data set 29 can comprise walk-in information.
  • the walk-in information comprises a list of visits or physical walk-in sessions, respectively, of the respective visitor.
  • The walk-in information can also comprise a time stamp of the moment when the user has entered the POS area 2. If the POS area 2 comprises several entrance areas 9, then the walk-in information can be an identification code of one specific entrance area 9.
  • The walk-in information can further comprise a facial-based age and/or gender which can be extracted from the images by means of the face detection module 19.
  • This data set 29 of the POS database can additionally comprise information about the shopping trail.
  • the shopping trail is recorded by means of coordinates of the visitor and corresponding time stamps.
  • The shopping trail can be a complete shopping trail from the entrance area 9 to the point of sale 3.
  • the shopping trail can also be a partial shopping trail of a visitor who enters a POS area 2 and leaves the POS area 2 without finalizing a purchase process.
  • Such a partial shopping trail can also be recorded at an external area of a shop building at which the user is looking at an advertisement window or any other display.
  • the data set of the user can also comprise information about the purchased items and the purchase time.
  • These data are provided by the cash register 4 when a purchase process is finalized. These data usually comprise an ID of each purchased article, the price and a timestamp.
  • the face template is generated by the face detection module 19. This face template is compared to the existing face templates in the POS database. If a similar face template is found, then it is judged that the visitor present in the POS area 2 is the same person to whom the existing data set belongs. In such a case all data generated by monitoring this visitor are entered in the existing data set.
  • Each data set 29 can thus comprise several face templates, several pieces of walk-in information, several shopping trails and of course several purchased items with different purchase times.
  • the master user database comprises data sets 30 for each user (Figure 11).
  • Each data set contains at least personal information and preferably a face template.
  • the personal information can comprise email address, physical address, phone number and/or any personal details which are usually acquired upon a user signing up to the online shopping system.
  • the data set comprises one or more face templates which are used as biometric identifiers.
  • The face template can be extracted from one or more images captured when the user is signing up to the online shopping system.
  • the face template can also be based on one or more images captured by the tracking system 1 at the POS area 2.
  • Preferably the face template of the data set 30 of the master user database has the same or a similar structure as the face template of the data set 29 of the POS database. This makes it easy to calculate the distance between a face template of the POS database and a face template of the master user database and to automatically judge whether these face templates belong to the same human being.
  • This data set can further comprise a time stamp when the user was signing up to the online shopping system for the first time.
  • the data set can also comprise a unique communication ID of a client device used by the user or an app used on the client device for communicating with the online shopping system.
  • the client device can be a desktop computer or any mobile device such as a mobile phone, tablet etc.
  • the communication ID can be a telephone number, email address, IP address, Skype address, an app ID or a combination thereof.
  • This data set can also contain online shopping behavior data.
  • the online shopping behavior data comprise browsing behavior data and/or shopping behavior data.
  • The browsing behavior data describe the websites that were browsed by the user.
  • The shopping behavior data describe the goods and/or services purchased with the online shopping system. These data can comprise corresponding time stamps.
  • the shopping behavior data can also comprise information on one or more abandoned online-carts.
  • the browsing behavior data and/or shopping behavior data can be recorded as plain data describing each item individually by a data value. As these data can grow to a large amount, it can be appropriate to carry out a statistical analysis and to record statistical values, such as mean values, standard deviations etc., for describing the behavior. With such statistical values certain detailed data can be omitted and the amount of data can be reduced. This makes it easier and faster to browse the data describing the browsing behavior and the shopping behavior.
  • the respective data of the POS database 14 (walk-in & time; shopping trail; offline purchase time and items) are added to the data set of this user.
  • the data describing the face templates of the POS database and the master user database are preferably combined. This combination can be carried out by just copying the face template data of the POS database to the data set 30 of the master user database. This combination can alternatively be carried out by calculating a new single face template which is based on both the face template of the data set 29 of the POS database as well as the face template of the data set 30 of the master user database.
  • the data structure of the data sets of the databases 14, 17 as shown in Figures 10 and 11 can be used in the above-described first embodiment of the tracking system 1.
  • the data of the online shopping system are supplemented with offline shopping behavior data based on the physical activities of the visitor in the real world.
  • This improves the online shopping system and makes it possible for the user of the online shopping system to be provided with information also selected on the basis of their offline shopping behavior.
  • the online shopping system can also comprise the function of a loyalty program for customers of retail businesses, with which shoppers can collect points for purchases and redeem them for vouchers, goods or money.
  • with this tracking system a visitor of a certain POS area 2 is monitored and data of each visit are recorded in the POS database. If such a visitor becomes a user of the online shopping system, then they already have a history describing their offline shopping behavior in the POS area 2.
  • the online shopping system can also be used mainly for such a loyalty program and even need not comprise a purchase function with which the users can carry out online purchases.
  • Such an embodiment of the online shopping system allows a relatively small shop or a small chain of shops to easily establish and run a loyalty program which is highly efficient, because it detects every visit of the corresponding customers and thus considers every shopping activity of said customer in real time. The information about the shopping activities is very detailed, and this information is generated without the need to enter any data manually.
  • Figure 8 schematically shows a further embodiment, wherein the same parts are designated with the same reference signs. Therefore a further description of these parts can be omitted.
  • the POS database 14 is used to store the data sets of each visitor 5 visiting the POS area 2 at least once. If a matching account is available in the master user database 17 of an online shopping system 28, the relevant data of the account are read from the master user database 17 and added to the data set of the corresponding visitor 5 in the POS database 14. In this tracking system 1 the POS database 14 is supplemented by the information contained in the master user database 17.
  • Figure 9 shows a further embodiment of the tracking system 1, which is similar to the one of figure 8.
  • the tracking system 1 is connected to several online shopping systems 28 which are independent from each other.
  • the POS database 14 is supplemented by the content of the master user databases 17 of the different online shopping systems 28. Thus, the databases for the service functions can be enlarged even more.
  • the embodiments according to Figures 8 and 9 are particularly advantageous for using the data basis of large online shopping systems in real-world shops. Such a system allows a real-world shop to be run more efficiently, because due to the detailed data of the online shopping system the services provided to the potential customers can be improved significantly and the management board of the real-world shop knows their customers much better than without the data of the online shopping system.
  • the online shopping system described above in the first embodiment can be connected to several real-world tracking systems 1, so that the content of the master user database 17 is automatically supplemented by the shopping behavior in a plurality of POS areas 2.
  • the online shopping system contains information on the users which is also based on the real-world shopping behavior (offline shopping behavior), which makes the corresponding statistical data for describing the shopping behavior of the user much more reliable for statistical analysis.
  • This data can also be used for the online shopping system.
  • E.g., analyzing the time a visitor 5 spends in front of a certain advertisement or display in the POS area 2 provides detailed information about the personal interests and wishes of the corresponding visitor 5.
  • the data structure of the databases 14, 17 used in the embodiments according to Figures 8 and 9 differs from the one shown in Figures 10 and 11 in that certain data of the master user database (personal information, sign-up time stamp, communication ID and/or online shopping behavior) are added to the corresponding data sets of the POS database 14.
  • modules 19, 20, 21, 22, 25, 26 and 27 are embodied as software modules which are stored and executable on the central control unit 12.
PCT/SG2017/050111 2016-03-09 2017-03-08 Method and system for visitor tracking at a POS area WO2017155466A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
AU2017231602A AU2017231602A1 (en) 2016-03-09 2017-03-08 Method and system for visitor tracking at a POS area
CN201780018258.6A CN109074498A (zh) 2016-03-09 2017-03-08 Method and system for visitor tracking at a POS area
PH12018501919A PH12018501919A1 (en) 2016-03-09 2018-09-07 Method and system for visitor tracking at a POS area

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SG10201601838T 2016-03-09
SG10201601838TA SG10201601838TA (en) 2016-03-09 2016-03-09 Method and system for visitor tracking at a POS area

Publications (1)

Publication Number Publication Date
WO2017155466A1 true WO2017155466A1 (en) 2017-09-14

Family

ID=59789842

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2017/050111 WO2017155466A1 (en) 2016-03-09 2017-03-08 Method and system for visitor tracking at a POS area

Country Status (5)

Country Link
CN (1) CN109074498A (zh)
AU (1) AU2017231602A1 (zh)
PH (1) PH12018501919A1 (zh)
SG (1) SG10201601838TA (zh)
WO (1) WO2017155466A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108122012A (zh) * 2017-12-28 2018-06-05 百度在线网络技术(北京)有限公司 Method, apparatus, device and storage medium for determining the center point of resident points
CN109492626A (zh) * 2019-01-11 2019-03-19 敏科信息科技(广州)有限公司 Seamless facial recognition method and system for sales-site management
CN111565225A (zh) * 2020-04-27 2020-08-21 银河水滴科技(北京)有限公司 Method and device for determining a person's movement trajectory

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112381982B (zh) * 2020-10-19 2022-02-22 北京科技大学 Unmanned supermarket system built on deep learning
CN113706767B (zh) * 2021-10-27 2022-02-08 深圳市恒裕惠丰贸易有限公司 Registration and storage system for automatically recognizing coin denominations

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110257985A1 (en) * 2010-04-14 2011-10-20 Boris Goldstein Method and System for Facial Recognition Applications including Avatar Support
CN102592116A (zh) * 2011-12-27 2012-07-18 Tcl集团股份有限公司 Cloud computing application method and system, terminal device and cloud computing platform
CN103606093A (zh) * 2013-10-28 2014-02-26 燕山大学 Intelligent VIP customer service system for chain organizations based on human body features
US20140244678A1 (en) * 2013-02-28 2014-08-28 Kamal Zamer Customized user experiences
CN104268725A (zh) * 2014-10-30 2015-01-07 胡新元 Customer management system and method based on face recognition technology
CN104346883A (zh) * 2013-08-04 2015-02-11 郁晓东 POS device for recognizing customers
CN104573619A (zh) * 2014-07-25 2015-04-29 北京智膜科技有限公司 Intelligent advertisement big data analysis method and system based on face recognition
WO2015103970A1 (en) * 2014-01-09 2015-07-16 Tencent Technology (Shenzhen) Company Limited Method, apparatus and system for authenticating user

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011516966A (ja) * 2008-04-02 2011-05-26 グーグル インコーポレイテッド Method and apparatus for incorporating automatic face recognition into digital image collections
US8521703B2 (en) * 2010-11-05 2013-08-27 International Business Machines Corporation Multiple node/virtual input/output (I/O) server (VIOS) failure recovery in clustered partition mobility
CN103839150B (zh) * 2012-11-20 2018-03-02 金蝶软件(中国)有限公司 ERP business email processing method and device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110257985A1 (en) * 2010-04-14 2011-10-20 Boris Goldstein Method and System for Facial Recognition Applications including Avatar Support
CN102592116A (zh) * 2011-12-27 2012-07-18 Tcl集团股份有限公司 Cloud computing application method and system, terminal device and cloud computing platform
US20140244678A1 (en) * 2013-02-28 2014-08-28 Kamal Zamer Customized user experiences
CN104346883A (zh) * 2013-08-04 2015-02-11 郁晓东 POS device for recognizing customers
CN103606093A (zh) * 2013-10-28 2014-02-26 燕山大学 Intelligent VIP customer service system for chain organizations based on human body features
WO2015103970A1 (en) * 2014-01-09 2015-07-16 Tencent Technology (Shenzhen) Company Limited Method, apparatus and system for authenticating user
CN104573619A (zh) * 2014-07-25 2015-04-29 北京智膜科技有限公司 Intelligent advertisement big data analysis method and system based on face recognition
CN104268725A (zh) * 2014-10-30 2015-01-07 胡新元 Customer management system and method based on face recognition technology

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108122012A (zh) * 2017-12-28 2018-06-05 百度在线网络技术(北京)有限公司 Method, apparatus, device and storage medium for determining the center point of resident points
CN108122012B (zh) * 2017-12-28 2020-11-24 百度在线网络技术(北京)有限公司 Method, apparatus, device and storage medium for determining the center point of resident points
CN109492626A (zh) * 2019-01-11 2019-03-19 敏科信息科技(广州)有限公司 Seamless facial recognition method and system for sales-site management
CN111565225A (zh) * 2020-04-27 2020-08-21 银河水滴科技(北京)有限公司 Method and device for determining a person's movement trajectory
CN111565225B (zh) * 2020-04-27 2023-08-04 银河水滴科技(宁波)有限公司 Method and device for determining a person's movement trajectory

Also Published As

Publication number Publication date
CN109074498A (zh) 2018-12-21
SG10201601838TA (en) 2017-10-30
AU2017231602A1 (en) 2018-09-27
PH12018501919A1 (en) 2019-06-24

Similar Documents

Publication Publication Date Title
CN106776619B (zh) 用于确定目标对象的属性信息的方法和装置
US11790433B2 (en) Constructing shopper carts using video surveillance
Liciotti et al. Person re-identification dataset with rgb-d camera in a top-view configuration
Burton et al. Face recognition in poor-quality video: Evidence from security surveillance
WO2017155466A1 (en) Method and system for visitor tracking at a pos area
US8010402B1 (en) Method for augmenting transaction data with visually extracted demographics of people using computer vision
JP6500374B2 (ja) 画像処理装置及び画像処理プログラム
KR101779096B1 (ko) 지능형 영상분석 기술 기반 통합 매장관리시스템에서의 객체 추적방법
WO2016190814A1 (en) Method and system for facial recognition
Jaiswal et al. An intelligent recommendation system using gaze and emotion detection
Liu et al. Customer behavior classification using surveillance camera for marketing
CN111784396A (zh) 一种基于用户画像的双线购物追踪系统及方法
CA3014365C (en) System and method for gathering data related to quality of service in a customer service environment
CN109993595A (zh) 个性化推荐商品及服务的方法、系统及设备
US20210264496A1 (en) Machine learning for rapid analysis of image data via curated customer personas
US11068873B1 (en) Methods, systems, apparatuses, and devices for facilitating advertising of a product
Ma et al. Gaussian descriptor based on local features for person re-identification
US20230111437A1 (en) System and method for content recognition and data categorization
JP2017130061A (ja) 画像処理システム、画像処理方法およびプログラム
Wei et al. Subject centric group feature for person re-identification
CN112131477A (zh) 一种基于用户画像的图书馆图书推荐系统及方法
Denman et al. Identifying customer behaviour and dwell time using soft biometrics
Zhang et al. Real-time clothes comparison based on multi-view vision
JP6813039B2 (ja) 画像処理装置及び画像処理プログラム
Zhou et al. A framework for semantic people description in multi-camera surveillance systems

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017231602

Country of ref document: AU

Date of ref document: 20170308

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17763667

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17763667

Country of ref document: EP

Kind code of ref document: A1