EP3467709A1 - Face recognition method and system for personal identification and authentication - Google Patents

Face recognition method and system for personal identification and authentication

Info

Publication number
EP3467709A1
Authority
EP
European Patent Office
Prior art keywords
customer
image
input image
features
specular reflection
Prior art date
Legal status
Granted
Application number
EP18198611.8A
Other languages
English (en)
French (fr)
Other versions
EP3467709B1 (de)
Inventor
Felix Chow
Chiu Wa Ng
Chun Ho Yip
How Chun Lau
Shan Shan ZHENG
Current Assignee
Hampen Technology Corp Ltd
Original Assignee
Hampen Technology Corp Ltd
Priority date
Filing date
Publication date
Priority claimed from US 15/727,717 (US10061996B1)
Application filed by Hampen Technology Corp Ltd
Publication of EP3467709A1
Application granted
Publication of EP3467709B1

Classifications

    • G06Q30/0269 Targeted advertisements based on user profile or attribute
    • G06V40/172 Human faces: classification, e.g. identification
    • G06F16/784 Video retrieval using metadata automatically derived from the content, the detected or recognised objects being people
    • G06F16/7867 Video retrieval using manually generated information, e.g. tags, keywords, comments
    • G06Q30/0242 Determining effectiveness of advertisements
    • G06Q30/0255 Targeted advertisements based on user history
    • G06Q30/0272 Period of advertisement exposure
    • G06T7/248 Analysis of motion using feature-based methods involving reference images or patches
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G06V10/30 Image preprocessing: noise filtering
    • G06V10/56 Extraction of image or video features relating to colour
    • G06V40/166 Human faces: detection, localisation, normalisation using acquisition arrangements
    • G06V40/168 Human faces: feature extraction; face representation
    • G06V40/175 Facial expression recognition: static expression
    • G06V40/40 Spoof detection, e.g. liveness detection
    • G06V40/45 Detection of the body part being alive
    • H04N23/611 Control of cameras or camera modules based on recognised objects, where the recognised objects include parts of the human body
    • G06Q20/20 Point-of-sale [POS] network systems
    • G06T2207/10016 Image acquisition modality: video; image sequence
    • G06T2207/30244 Subject/context of image processing: camera pose

Definitions

  • the present invention relates generally to anti-spoofing in face recognition for personal identification, authentication, advertisement and other security purposes. Particularly, the present invention relates to a face recognition method and system for tackling spoofing with real-time reenactment video on a high definition display and identifying customers to provide customized services and targeted advertisements.
  • Face recognition has numerous security-related applications, such as user identification, user authentication for online and offline resource access, and door and gate unlocking.
  • face recognition methods use a real-time captured image of the subject's face to find a match in a library of previously captured facial images.
  • the face matching process is relatively accurate in comparison to other biometric identification methods with well-developed and tested facial feature analysis techniques.
  • the techniques extract characteristic data of a face region as shown in FIG. 1 , which is unique for each person, from the captured image of the subject's face, and compares the characteristic data against those stored in the library, thereby matching the subject's face to that of a previously registered person.
  • Some face recognition systems have been developed to determine whether the subject is a living person by ways of requiring the subject to make multiple facial expressions and movements during an identification session and allowing the system to detect and capture the multiple frames of facial expressions and movements for matching.
  • U.S. Patent No. 6,922,478 disclosed a method for verifying the authenticity of a captured image of a person comprising recording a sequence of consecutive individual images of the person and determining the authenticity of the recorded images by checking if at least in two consecutive individual images of the sequence intrinsic movements can be detected.
  • an authenticity verifying system may be deceived by printed photographs or an electronic display showing images of the subject to be authenticated as illustrated in FIG. 2 .
  • European Patent No. 1990770 disclosed a face authenticating apparatus that includes a presentation pattern display unit, provided at a different position from a key input unit, to display an instruction for a user to input a key pattern during facial authentication; and an image capturing unit for capturing the face of the user and/or a movement of a portion of the face of the user during part or all of the time from when the presentation pattern display unit displays the instruction to when the key input is completed. From the process executed by the apparatus, it is determined whether the captured face image is of a living person.
  • the requirement for interactive inputs limits its applications, usefulness, and the types of users.
  • U.S. Patent No. 9,619,723 disclosed a process of 3D perspective check comprising collecting two or more images of the subject's face. The two or more images of the subject's face are then used to calculate the stereoscopic view data of the subject's face.
  • a face recognition system could produce false rejection if the subject maintains perfect face alignment with the camera's view center as illustrated in FIG. 3 .
  • Face recognition has been known to be an effective way for personal identification.
  • Traditional face recognition systems usually capture the face of a particular subject and match it with a library of previously captured facial images in a one-to-one manner for security or authentication purposes.
  • a challenge in using face recognition systems for customer identification is that the number of customers to be identified at a particular business premise may not be known in advance. Also, it may be required to detect the presence of customers in a particular area of interest and automatically perform personal identification.
  • U.S. Patent No. 9,262,668 discloses a distant face recognition system comprising a primary and a plurality of secondary video cameras provided to monitor a detection area.
  • the primary video camera can detect people present in the detection zone.
  • Data can be then transmitted to a prioritizor module that produces a prioritized list of detected people.
  • the plurality of secondary video cameras then captures a high-resolution image of the faces of the people present in the detection area according to the prioritized list provided by the prioritizor module.
  • the high-resolution images can be then provided to a face recognition module, which is used to identify the people present in the detection area.
  • US Patent No. 8,769,556 discloses a method and apparatus for providing targeted advertisements based on face clustering for time-varying video. During operation, video of users of the system is continuously obtained. Users' faces are detected and measured, and the measurements are then clustered. Once the clusters are available, advertisements are targeted at the clusters rather than at individual users. However, because advertisements are targeted at the clusters rather than the individual users in this type of system, the content of the targeted advertisements cannot be personalized to relate to the targeted audience at a more personal level.
  • the method of personal identification and authentication comprises capturing an image of a subject to be authenticated; a step of face verification; followed by the process steps of a scan line detection test, a specular reflection detection test, and a chromatic moment and color diversity feature analysis test in no particular order.
  • the method requires a subject to present her face before a camera, which can be the built-in or peripheral camera of e.g. a mobile communication or computing device, a computer, or a stationary electronic device.
  • the method also requires displaying to the subject certain instructions and the real-time video feedback of the subject's face on a display screen, which can be the built-in or peripheral display screen of the mobile communication or computing device, computer, or stationary electronic device.
  • the step of face verification is to capture an image of the subject's face in a single frame shot, then preliminarily verifying the identity of the subject by matching the single frame face image against a database of pre-recorded face data records using existing face analysis and recognition techniques.
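The single-frame matching step above can be sketched as a nearest-neighbour search over face feature vectors. This is an illustrative reconstruction: the patent defers to existing face analysis and recognition techniques, and the cosine-similarity measure and the 0.8 acceptance threshold below are assumptions, not part of the disclosure.

```python
import numpy as np

def verify_face(probe: np.ndarray, database: dict, threshold: float = 0.8):
    """Match a probe face feature vector against pre-recorded records.

    `database` maps person IDs to feature vectors. Returns the
    best-matching ID if its cosine similarity exceeds `threshold`,
    otherwise None (cosine similarity and the threshold value are
    illustrative choices; the patent leaves the matching technique
    to existing methods).
    """
    best_id, best_score = None, -1.0
    for person_id, record in database.items():
        score = float(np.dot(probe, record) /
                      (np.linalg.norm(probe) * np.linalg.norm(record)))
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= threshold else None
```

A probe close to a stored record is accepted; raising the threshold trades false acceptances for false rejections.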
  • the scan line detection test is based on detecting Moiré patterns created by the overlapping of the digital grid of a spoof image from a digital media display and the grid of an image sensor of a camera in a face-recognition system.
  • the spoof image may be an image extracted from a pre-recorded or real-time reenactment video of a person's face displayed on a high-resolution display such as a liquid crystal display (LCD).
  • the specular reflection detection test is based on the detection of specular reflection features of spoof images displayed in photographs or digital media displays having mirror or reflective surfaces. This is based on the general phenomenon that specular reflection is more likely to occur on a photo or a digital display, which usually have mirror-like or reflective surfaces, whereas diffuse reflection occurs on a genuine human face.
  • the specular reflection detection test comprises extracting multi-dimensional specular reflection features from the input image, wherein the extraction comprises discarding pixels with intensities outside of a pre-defined range; and classifying the extracted specular reflection features to determine whether the input image is an image of a genuine face or a spoof image.
  • a support vector machine (SVM) based classifier trained with certain training sets is used to classify the extracted specular reflection features.
  • the chromatic moment and color diversity feature analysis test employs a process in which the chromatic features and color histogram of a spoof image, which can be a reproduced face image shown on a printed photo or displayed by a digital media display such as an LCD display, are analyzed to see if its color diversity is reduced in comparison with an image of a genuine face. This is based on the fact that reproduced face images have a different color distribution from genuine faces due to the imperfect color reproduction properties of printing and digital displays.
  • the chromatic moment and color diversity feature analysis test comprises extracting the chromatic features and color histogram features of the input image in both the hue, saturation, and value (HSV) color space and the red, green, and blue (RGB) color space; and classifying the extracted chromatic features and color histogram features to determine whether the input image is an image of a genuine face or a spoof image.
  • an SVM-based classifier trained with certain training sets is used to classify the extracted chromatic features and color histogram features.
  • SVM scores greater than or equal to zero signify a positive detection, whereas a negative score signifies a rejection.
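Taken together, the face verification step and the three anti-spoofing tests form a conjunctive decision: the subject is accepted only if the face matches a registered record and every classifier returns a non-negative score. The sketch below assumes each SVM is trained so that a score of zero or above marks the input as genuine; the patent states only that a score greater than or equal to zero signifies a positive detection.

```python
def authenticate(face_match: bool, svm_scores: dict) -> bool:
    """Combine face verification with the three anti-spoofing tests.

    `svm_scores` holds the decision values of the scan line, specular
    reflection, and chromatic moment / color diversity classifiers.
    Each classifier is assumed to be trained so that a score >= 0
    flags the input as genuine (assumed polarity).
    """
    if not face_match:
        return False            # identity not verified: reject outright
    # All three anti-spoofing classifiers must report a positive detection.
    return all(score >= 0 for score in svm_scores.values())
```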
  • an automatic identification and tracking method is provided to identify whether a customer entering a premise, such as a shopping mall or retail store, is a previously registered or remembered customer (or VIP), retrieve profile, demographical data and/or point of sale (POS) records of the customer, track location of the customer, and send the retrieved profile and tracked location of the customer to computing devices configured to be used by sales/service staffs.
  • the automatic identification and tracking method further comprises displaying targeted advertisements associated with demographical data of the customer in a frontend device to the customer; detecting a plurality of sentiments of the customer watching the targeted advertisements; measuring a dwell time of watching the targeted advertisements by the customer; and performing analysis on effectiveness of targeted advertisements based on the demographic data, the detected sentiments and measured dwell time of the customer.
  • the automatic identification and tracking method further comprises utilizing a plurality of cameras installed at various locations in the premise.
  • the various locations include, but are not limited to, advertisement displays, signage devices, merchandise shelves, display counters, entrances, and exits.
  • the face recognition system can be implemented in a mobile communication device (e.g. "smartphone” and personal digital assistant), a mobile or personal computing device (e.g. “tablet” computer, laptop computer, and personal computer), a kiosk, or a user terminal having a built-in or peripheral camera and an electronic display screen.
  • the face recognition method comprises capturing an input image of a subject to be authenticated; conducting face verification 1001 to verify the identity of the subject by matching the input image against a database of pre-recorded face data records using a face analysis and recognition method, which can be based on presently available techniques; and conducting anti-spoofing tests including a scan line detection test 1002, a specular reflection detection test 1003, and a chromatic moment and color diversity feature analysis test 1004.
  • the method also requires displaying to the subject certain instructions and the real-time video feedback of the subject's face on a display screen, which can be the built-in or peripheral display screen of the mobile communication or computing device, computer, or stationary electronic device.
  • the step of face verification is to capture an image of the subject's face in a single frame shot, then preliminarily verifying the identity of the subject by matching the single frame face image against a database of pre-recorded face data records using existing face analysis and recognition techniques.
  • FIG. 4 shows an exemplary user interface of a face verification apparatus in accordance to an embodiment of the present invention.
  • the scan line detection test is to detect Moiré patterns, as shown in FIG. 5, created by the overlapping of the digital grid from a digital media display and the grid of the image sensor of the camera in the face-recognition system, to determine whether the input image is a spoof image provided with a digital media display such as an LCD display.
  • As shown in FIGs. 7a and 7b, peaks can be found in the frequency domain of a spoof image captured from an LCD display due to the Moiré patterns, whereas no peaks are found in the frequency domain of an image captured from a living person.
  • the scan line detection test comprises the following steps:
  • the scan line detection test further comprises increasing the standard deviation σ by an increment of Δσ if p > p_min; repeating the aforesaid steps, from applying band-pass filtering to determining the existence of peaks, if σ ≤ σ_max, where σ_max is a pre-defined maximum value of σ; and determining that no peaks exist in the input image if σ > σ_max.
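A minimal sketch of the scan line test's frequency-domain idea: transform the image with a 2-D FFT, suppress low frequencies with a Gaussian of standard deviation σ (a crude band-pass), and declare Moiré peaks when some component stands far above the average magnitude, widening σ stepwise up to σ_max as in the loop described above. The Gaussian filter shape and the 8x-mean peak criterion are illustrative assumptions; the patent does not specify the filter or the peak threshold.

```python
import numpy as np

def has_moire_peaks(image, sigma=1.0, sigma_max=4.0, step=0.5, ratio=8.0):
    """Look for periodic peaks in the frequency domain of a grayscale image.

    The centred FFT magnitude is multiplied by an inverted Gaussian of
    standard deviation `sigma` to suppress the DC region, and a peak is
    declared when the strongest remaining component exceeds `ratio`
    times the mean magnitude. `sigma` is widened in `step` increments
    up to `sigma_max`, mirroring the patent's loop.
    """
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    h, w = spectrum.shape
    yy, xx = np.mgrid[-h // 2:(h + 1) // 2, -w // 2:(w + 1) // 2]
    dist2 = xx ** 2 + yy ** 2
    while sigma <= sigma_max:
        # Suppress low frequencies (DC and slow illumination changes).
        filtered = spectrum * (1.0 - np.exp(-dist2 / (2.0 * sigma ** 2)))
        if filtered.max() > ratio * filtered.mean():
            return True
        sigma += step
    return False
```

A sinusoidal grid (as a display's pixel lattice produces) yields sharp off-centre peaks, while noise-like or smooth content does not.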
  • the specular reflection detection test is to detect specular reflection features of a mirror or reflective surface to determine whether the input image is a spoof image provided with a photograph or a digital media display.
  • the specular reflection detection test is based on the detection of specular reflection features of spoof images displayed in photographs or digital media displays having mirror or reflective surfaces. This is based on the fact that specular reflection is more likely to occur on a photo or a digital display, which usually have mirror-like or reflective surfaces, whereas diffuse reflection occurs on a genuine human face.
  • the specular reflective component is extracted from the input image.
  • the process involves separating the reflective component versus the diffuse component based on their chromaticity variation under varying light intensity. For diffuse color, the chromaticity stays constant.
  • the specular reflection detection test then comprises: extracting multi-dimensional specular reflection features from the input image, wherein the specular reflection features comprise a specular pixel percentage, an intensity value, and a variation of the pixels; discarding one or more pixels with intensities outside of a pre-defined intensity range (e.g. discarding the one or more pixels with intensities outside of the range [μ, 5μ], where μ is the mean pixel intensity value); and classifying, using machine learning techniques, the extracted specular reflection features to determine whether the input image is a spoof image.
  • an SVM-based classifier trained with certain training sets is used to classify the extracted specular reflection features.
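The three named features (specular pixel percentage, intensity value, and pixel variation) can be sketched as follows. The rule used to call a pixel "specular" is not given in the text, so the 2x-mean threshold below is an assumption, as is treating the mean-intensity multiple range as [μ, 5μ].

```python
import numpy as np

def specular_features(image: np.ndarray) -> np.ndarray:
    """Extract the three specular reflection features named in the text:
    specular pixel percentage, intensity value, and pixel variation.

    Pixels with intensities outside [mu, 5*mu], mu being the mean
    intensity, are discarded first. A surviving pixel is treated as
    specular when its intensity exceeds twice the mean (assumed rule).
    """
    mu = image.mean()
    kept = image[(image >= mu) & (image <= 5 * mu)]   # discard out-of-range pixels
    specular = kept[kept > 2 * mu]                    # assumed specular detector
    pct = specular.size / max(kept.size, 1)           # specular pixel percentage
    return np.array([pct, kept.mean(), kept.std()])
```

The resulting feature vectors would then be fed to an SVM classifier (for example scikit-learn's `svm.SVC`) trained on genuine and spoof image sets.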
  • the chromatic moment and color diversity feature analysis is based on the general phenomenon that reproduced face images have a different color distribution from genuine face images, caused by the imperfect color reproduction properties of printing and digital displays. In addition, color diversity is also reduced in reproduced face images, whereas genuine face images generally have richer colors.
  • the chromatic moment and color diversity feature analysis test is to detect if the color diversity of the input image is reduced in order to determine whether the input image is a spoof image provided with a print photograph or a digital media display.
  • the chromatic moment and color diversity feature analysis comprises extracting the chromatic features and color histogram features of the input image in both HSV and RGB spaces; classifying, using machine learning techniques, the extracted chromatic features and color histogram features to determine whether the input image is a spoof image.
  • an SVM-based classifier trained with certain training sets is used to classify the extracted chromatic features and color histogram features.
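A sketch of the chromatic feature extraction: for each of the six channels (R, G, B, H, S, V) the first three moments and a normalised histogram are concatenated into one vector. The choice of three moments and eight bins is illustrative; the patent names only "chromatic features and color histogram features" in both spaces.

```python
import colorsys
import numpy as np

def chromatic_features(rgb: np.ndarray, bins: int = 8) -> np.ndarray:
    """Chromatic moments and colour histograms in both RGB and HSV space.

    `rgb` is an (N, 3) float array of pixels with values in [0, 1].
    For each of the six channels the mean, standard deviation, skewness,
    and a `bins`-bin normalised histogram are concatenated.
    """
    hsv = np.array([colorsys.rgb_to_hsv(*px) for px in rgb])
    feats = []
    for channel in np.hstack([rgb, hsv]).T:
        mean = channel.mean()
        std = channel.std()
        skew = ((channel - mean) ** 3).mean() / (std ** 3 + 1e-12)
        hist, _ = np.histogram(channel, bins=bins, range=(0.0, 1.0))
        feats.extend([mean, std, skew])
        feats.extend(hist / channel.size)      # normalised colour histogram
    return np.array(feats)
```

A spoof image with reduced colour diversity concentrates its histogram mass in fewer bins, which an SVM trained on such vectors can separate from genuine faces.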
  • the automatic identification and tracking method comprises Step 101: receiving a video stream from a camera installed in a premise, such as a shopping mall or retail store; Step 102: determining presence of a customer by face detection in the video stream; Step 103: extracting facial features of the customer; Step 104: matching the extracted facial features of the customer with previously registered customers' facial feature records in a database to determine whether the customer is a registered customer (a VIP or a regular customer); Step 105: retrieving the profile, which includes, but is not limited to, demographical data, and POS records (e.g. including purchase history) of the customer;
  • Step 106 tracking location of the customer based on the location of the camera;
  • Step 107 sending the retrieved profile, POS records, and tracked location of the customer to one or more computing devices configured to be used by sales/service staffs;
  • Step 108 selecting from a plurality of pre-defined advertisements one or more targeted advertisements based on the retrieved profile, POS records, and tracked location of the customer;
  • Step 109 sending notification and contents of the targeted advertisements to a mobile communication device of the customer.
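Steps 104 through 107 above can be sketched with hypothetical data types (the `Customer` record, the `match_fn` callback, and the returned payload are all illustrative; the patent does not define these structures):

```python
from dataclasses import dataclass, field

@dataclass
class Customer:
    profile: dict            # demographical data and other profile fields
    pos_records: list        # purchase history
    locations: list = field(default_factory=list)

def process_frame(face_features, camera_location, registry, match_fn):
    """Match extracted facial features against registered customers
    (`match_fn` returns a customer ID or None), track the camera's
    location, and return the data a staff device would receive.
    """
    customer_id = match_fn(face_features, registry)
    if customer_id is None:
        return None                    # unregistered: a profile would be collected
    customer = registry[customer_id]
    customer.locations.append(camera_location)   # Step 106: location tracking
    return {                                     # Step 107: payload for staff devices
        "id": customer_id,
        "profile": customer.profile,
        "pos_records": customer.pos_records,
        "location": customer_location if False else camera_location,
    }
```

Each camera frame updates the customer's location trail, so staff devices always receive the most recent sighting.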
  • the selection of targeted advertisements can be based on the featured products/services in the targeted advertisements that are determined to be relevant to one or more of the retrieved profile, POS records, and tracked location of the customer. For example, where the demographical data indicates a female in her late twenties with a recent purchase of a baby stroller and a tracked location in the infant food section of the store, the result is a selection of targeted advertisements featuring baby formula.
  • machine learning techniques are employed in targeted-advertisements selection.
  • One such machine learning technique makes use of historical POS records and profiles of a plurality of registered customers, and compares them with the present customer's profile to identify similarities and patterns of purchases, in turn selecting the targeted advertisements accordingly.
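The relevance-based selection described above can be approximated with a simple tag-overlap ranking. This is an illustrative stand-in: the patent does not specify the similarity measure, and the tag-set representation of advertisements and customer context is an assumption.

```python
def select_ads(customer: dict, ads: dict, top_n: int = 1) -> list:
    """Rank pre-defined advertisements by how many of their feature tags
    match the customer's profile, POS records, and tracked location.

    `ads` maps advertisement names to sets of feature tags; the customer
    dict holds `profile` and `pos` tag sets plus a `location` string.
    """
    context = set(customer["profile"]) | set(customer["pos"]) | {customer["location"]}
    # More overlapping tags means a more relevant advertisement.
    ranked = sorted(ads, key=lambda name: len(ads[name] & context), reverse=True)
    return ranked[:top_n]
```

With the text's own example, a baby-formula advertisement tagged with the stroller purchase and the infant food section outranks unrelated advertisements.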
  • the aforesaid Step 102 and Step 103 are performed using the aforesaid face recognition methods and systems.
  • the automatic identification and tracking method further comprises Step 109: collecting the profile, including at least demographical data, of the customer if the customer is not a registered customer.
  • the automatic identification and tracking method further comprises Step 110: if the customer is a registered customer, displaying the targeted advertisements in a frontend device to the customer, otherwise displaying random advertisements in the frontend device to the customer; Step 111: detecting the sentiments of the customer watching the advertisements being displayed in the frontend device; Step 112: measuring a dwell time of the customer watching the advertisements; and Step 113: determining the effectiveness of the advertisements based on the detected sentiments and measured dwell time of the customer.
  • For example, if a pleasant sentiment, i.e. a physical display of pleasant emotions, is detected when a particular advertisement, type of advertisement, or advertisement of a certain product or service is on display, the advertisement is considered to be effective.
  • Likewise, if the measured dwell time exceeds a threshold dwell time (e.g. 10 seconds), the advertisement is considered to be effective; otherwise ineffective.
  • Conversely, if an unpleasant sentiment, i.e. a physical display of distasteful emotions (including a frown or an indifferent expression), is detected when a particular advertisement, type of advertisement, or advertisement of a certain product or service is on display, the advertisement is considered to be ineffective.
  • the advertisement effectiveness information associated with the customer is recorded. Subsequent selection of targeted advertisements can then be based on the historical records of effectiveness information of types of advertisements and/or advertisements of certain products or services, in addition to the retrieved profile, POS records, and tracked location of the customer.
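The two effectiveness signals can be combined as follows. How the patent resolves conflicting signals (e.g. a frown with a long dwell time) is not stated, so giving sentiment priority over dwell time is an assumption, as is the three-way sentiment labelling.

```python
def ad_effective(sentiment: str, dwell_time: float, threshold: float = 10.0) -> bool:
    """Judge advertisement effectiveness from sentiment and dwell time.

    A pleasant sentiment (e.g. a smile) marks the advertisement
    effective, an unpleasant one (e.g. a frown) ineffective; for a
    neutral sentiment the dwell time decides against a threshold
    (10 seconds in the text's example). Sentiment is given priority
    over dwell time by assumption.
    """
    if sentiment == "unpleasant":
        return False
    if sentiment == "pleasant":
        return True
    return dwell_time >= threshold
```

Logging these boolean outcomes per customer yields exactly the historical effectiveness records that later advertisement selection draws on.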
  • the automatic identification and tracking method further comprises utilizing a plurality of cameras installed at various locations on the premises.
  • the various locations include, but are not limited to, advertisement displays, signage devices, merchandise shelves, display counters, entrances, and exits.
  • This information is then sent to the one or more computing devices configured to be used by sales/service staffs, and analyzed for the customer's interests in goods and services and shopping preferences in Step 116. Consequently, the staffs can make use of the analysis results to better market goods and services, and provide a personalized shopping experience to each customer.
  • the automatic identification and tracking method preferably comprises dynamically controlling the lighting for capturing of the video streams by the digital video cameras so as to maintain high face recognition accuracy.
  • an automatic identification and tracking system 200 comprises at least one camera 201 for capturing video streams; an identification and tracking server 202 for identifying whether a customer is a registered customer (a VIP or a regular customer), retrieving profile, including demographical data, and POS records (e.g. including purchase history) of the customer, tracking location of the customer, and sending the retrieved profile and tracked location of the customer to a plurality of computing devices configured to be used by sales/service staffs; and at least one frontend device 203 for displaying a plurality of advertisements to the customer, collecting profile, including at least demographical data, of the customer through a user interface, which can be a graphical user interface displayed in an electronic touch screen of the frontend device 203.
  • the identification and tracking server 202, working in conjunction with the camera 201, is further configured for capturing and detecting sentiments of the customer watching the advertisements, measuring the dwell time of the customer watching the advertisements, and performing analysis of the effectiveness of the advertisements based on the detected sentiments and measured dwell times.
  • the identification and tracking server 202 comprises at least one storage media 204 for storing a database of facial features, demographical data, profiles, and types (e.g. VIP or regular) of registered customers, recorded sentiments and dwell times of the customers with references to the advertisements watched by the customers, and the associated advertisement effectiveness analysis results; and at least one face recognition engine 205 for determining the presence of a customer, extracting facial features, matching the extracted facial features with registered customers' facial feature records in the database, detecting sentiments, measuring dwell times, and analyzing advertisement effectiveness.
  • the cameras 201 are built-in or peripheral cameras of the frontend devices 203.
  • the frontend devices 203 comprise at least one electronic display.
  • the cameras 201 and the frontend devices 203 are positioned strategically at advertisement displays, merchandise shelves, display counters, entrances, and exits of the premises.
  • the frontend device 203 is a signage device, such as a kiosk or an electronic billboard, having an electronic display.
  • the computing device configured to be used by sales/service staffs can be a personal computer, laptop computer, mobile computing device such as a "smartphone" or "tablet" computer, or kiosk configured to execute machine instructions that communicate with the identification and tracking server 202 and render a graphical user interface to display data received from the identification and tracking server 202 to, and interact with, the sales/service staff.
  • the mobile communication device of the customer can be a mobile computing device such as "smartphone” and "tablet” computer.
  • face recognition system and the automatic identification and tracking system comprise at least machine instructions for rendering and controlling a graphical user interface displayed on the electronic display, machine instructions for controlling the camera for capturing images and videos, machine instructions for performing the face recognition and anti-video-spoofing algorithms, and machine instructions for performing the customer identification and tracking; wherein the machine instructions can be executed using general purpose or specialized computing devices, computer processors, or electronic circuitries including, but not limited to, digital signal processors (DSP), application specific integrated circuits (ASIC), field programmable gate arrays (FPGA), and other programmable logic devices.
  • the systems include computer storage media having computer instructions or software codes stored therein which can be used to program computers or microprocessors to perform any of the processes of the present invention.
  • the storage media can include, but are not limited to, floppy disks, optical discs, Blu-ray Discs, DVDs, CD-ROMs, magneto-optical disks, ROMs, RAMs, flash memory devices, or any other type of media or devices suitable for storing instructions, codes, and/or data.
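As an illustrative sketch only (not part of the patent disclosure), the similarity-based selection of targeted advertisements described above can be modelled as a nearest-profile lookup over purchase-count vectors built from historical POS records; the function names, the cosine metric, and the data shapes are all assumptions.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length purchase-count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def select_targeted_ads(current_profile, registered_profiles, ads_by_customer, top_k=1):
    """Pick the advertisements associated with the registered customers whose
    purchase patterns are most similar to the present customer's profile."""
    ranked = sorted(
        registered_profiles.items(),
        key=lambda item: cosine_similarity(current_profile, item[1]),
        reverse=True,
    )
    targeted = []
    for customer_id, _ in ranked[:top_k]:
        targeted.extend(ads_by_customer.get(customer_id, []))
    return targeted
```

Any other similarity measure, or a trained recommender model, could replace the cosine ranking; the patent leaves the concrete machine learning technique open.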
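The effectiveness rule of Steps 111 to 113 (pleasant sentiment or sufficient dwell time implies effective, unpleasant sentiment implies ineffective) can be sketched as a small decision function; the string labels and the "neutral" fallback case are hypothetical, and the 10-second default merely echoes the example threshold given above.

```python
def is_advertisement_effective(sentiment, dwell_time_seconds, dwell_threshold_seconds=10.0):
    """Combine a detected sentiment label with a measured dwell time.

    An unpleasant sentiment marks the advertisement ineffective regardless of
    dwell time; a pleasant sentiment marks it effective; for any other label
    the dwell time must exceed the threshold (e.g. 10 seconds).
    """
    if sentiment == "unpleasant":
        return False
    if sentiment == "pleasant":
        return True
    return dwell_time_seconds > dwell_threshold_seconds
```

The boolean result would then be recorded per customer and advertisement, feeding the historical effectiveness records used for subsequent targeted-ad selection.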
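The matching performed by the face recognition engine 205 (matching extracted facial features against registered customers' feature records in the database) can be illustrated as a nearest-neighbour search with a rejection threshold; the Euclidean metric, the 0.6 cutoff, and all names are assumptions, since the patent does not fix a particular matching algorithm.

```python
import math

def match_registered_customer(extracted_features, feature_database, max_distance=0.6):
    """Return the id of the closest registered customer, or None when the
    best match is farther than max_distance (an unregistered visitor)."""
    best_id, best_distance = None, float("inf")
    for customer_id, reference in feature_database.items():
        distance = math.dist(extracted_features, reference)
        if distance < best_distance:
            best_id, best_distance = customer_id, distance
    return best_id if best_distance <= max_distance else None
```

A match below the threshold would trigger retrieval of the customer's profile and POS records, while a rejection would route the visitor to profile collection (Step 109).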

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Human Computer Interaction (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Game Theory and Decision Science (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Library & Information Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)
  • Image Processing (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
EP18198611.8A 2017-10-09 2018-10-04 Face recognition system for personal identification and authentication Active EP3467709B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/727,717 US10061996B1 (en) 2017-10-09 2017-10-09 Face recognition method and system for personal identification and authentication
US15/808,910 US20190108551A1 (en) 2017-10-09 2017-11-10 Method and apparatus for customer identification and tracking system

Publications (2)

Publication Number Publication Date
EP3467709A1 true EP3467709A1 (de) 2019-04-10
EP3467709B1 EP3467709B1 (de) 2020-04-29

Family

ID=63762369

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18198611.8A Active EP3467709B1 (de) 2017-10-09 2018-10-04 Gesichtserkennungssystem zur persönlichen identifizierung und authentifizierung

Country Status (3)

Country Link
US (1) US20190108551A1 (de)
EP (1) EP3467709B1 (de)
CN (1) CN109635623A (de)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10614436B1 (en) * 2016-08-25 2020-04-07 Videomining Corporation Association of mobile device to retail transaction
US11188944B2 (en) * 2017-12-04 2021-11-30 At&T Intellectual Property I, L.P. Apparatus and methods for adaptive signage
US10936854B2 (en) * 2018-04-27 2021-03-02 Ncr Corporation Individual biometric-based tracking
CN110276263B (zh) * 2019-05-24 2021-05-14 长江大学 Face recognition system and recognition method
DE102019130527A1 (de) * 2019-11-12 2021-05-12 Shop-Iq Gmbh & Co. Kg Method and device for supporting the sale of goods intended for consumption to a plurality of customers
SE2050058A1 (en) * 2020-01-22 2021-07-23 Itab Shop Products Ab Customer behavioural system
US10885547B1 (en) * 2020-03-02 2021-01-05 Joseph Gottlieb Monitoring effectiveness of advertisement placement
CN111861576A (zh) * 2020-07-27 2020-10-30 合肥优恩物联网科技有限公司 System and method for pushing targeted advertisements based on autonomous face recognition
CN111985504B (zh) * 2020-08-17 2021-05-11 中国平安人寿保险股份有限公司 Artificial-intelligence-based recapture detection method, apparatus, device and medium
CN113705392B (zh) * 2021-08-16 2023-09-05 百度在线网络技术(北京)有限公司 Working state switching method, apparatus, device, storage medium and program product
CN114820025A (zh) * 2022-03-17 2022-07-29 国家珠宝检测中心(广东)有限责任公司 Automatic advertisement playing method for a jewelry terminal based on image recognition technology
US20240177189A1 (en) * 2022-11-29 2024-05-30 NexRetail Co., Ltd. Image data association method, system, apparatus and related computer program product
CN118015663B (zh) * 2024-04-09 2024-07-02 浙江深象智能科技有限公司 Employee identification method, apparatus and device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160071275A1 (en) * 2014-09-09 2016-03-10 EyeVerify, Inc. Systems and methods for liveness analysis

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2410359A (en) * 2004-01-23 2005-07-27 Sony Uk Ltd Display
US20070088607A1 (en) * 2005-10-04 2007-04-19 Tamago Measuring dwell time on an internet advertisement
US10169646B2 (en) * 2007-12-31 2019-01-01 Applied Recognition Inc. Face authentication to mitigate spoofing
US20100010890A1 (en) * 2008-06-30 2010-01-14 Eyeblaster, Ltd. Method and System for Measuring Advertisement Dwell Time
US20110257985A1 (en) * 2010-04-14 2011-10-20 Boris Goldstein Method and System for Facial Recognition Applications including Avatar Support
US9003196B2 (en) * 2013-05-13 2015-04-07 Hoyos Labs Corp. System and method for authorizing access to access-controlled environments
US10108977B2 (en) * 2013-08-23 2018-10-23 Oath Inc. Dwell time based advertising in a scrollable content stream
CN104573619A (zh) * 2014-07-25 2015-04-29 北京智膜科技有限公司 基于人脸识别的智能广告大数据分析方法及系统
US9251427B1 (en) * 2014-08-12 2016-02-02 Microsoft Technology Licensing, Llc False face representation identification

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DAS DHRUBAJYOTI ET AL: "Face liveness detection based on frequency and micro-texture analysis", 2014 INTERNATIONAL CONFERENCE ON ADVANCES IN ENGINEERING & TECHNOLOGY RESEARCH (ICAETR - 2014), IEEE, 1 August 2014 (2014-08-01), pages 1 - 4, XP032722516, ISSN: 2347-9337, [retrieved on 20150116], DOI: 10.1109/ICAETR.2014.7012923 *
RINKU DATTA RAKSHIT ET AL: "Face Spoofing and Counter-Spoofing: A Survey of State-of-the-art", TRANSACTIONS ON MACHINE LEARNING AND ARTIFICIAL INTELLIGENCE, VOL. 5, NO. 2, 9 May 2017 (2017-05-09), pages 31 - 73, XP055559503, Retrieved from the Internet <URL:http://sseuk.org/index.php/TMLAI/article/view/3130> [retrieved on 20190220], DOI: 10.14738/tmlai.52.3130 *
WEN DI ET AL: "Face Spoof Detection With Image Distortion Analysis", IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, IEEE, PISCATAWAY, NJ, US, vol. 10, no. 4, 1 April 2015 (2015-04-01), pages 746 - 761, XP011575418, ISSN: 1556-6013, [retrieved on 20150312], DOI: 10.1109/TIFS.2015.2400395 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11670069B2 (en) 2020-02-06 2023-06-06 ID R&D, Inc. System and method for face spoofing attack detection
EP4027266A1 (de) * 2021-01-06 2022-07-13 Amadeus S.A.S. Moiré pattern detection in digital images and a liveness detection system
WO2022148635A1 (en) * 2021-01-06 2022-07-14 Amadeus S.A.S. Moiré pattern detection in digital images and a liveness detection system thereof

Also Published As

Publication number Publication date
CN109635623A (zh) 2019-04-16
EP3467709B1 (de) 2020-04-29
US20190108551A1 (en) 2019-04-11

Similar Documents

Publication Publication Date Title
EP3467709B1 (de) Face recognition system for personal identification and authentication
CN108038456B (zh) Anti-spoofing method in a face recognition system
US10061996B1 (en) Face recognition method and system for personal identification and authentication
Liciotti et al. Person re-identification dataset with rgb-d camera in a top-view configuration
US20120140069A1 (en) Systems and methods for gathering viewership statistics and providing viewer-driven mass media content
CN105678591A (zh) Business intelligence operation decision support system and method based on video analysis
US8805123B2 (en) System and method for video recognition based on visual image matching
AU2016266493A1 (en) Method and system for facial recognition
CN109074498A (zh) Visitor tracking method and system for a POS area
CN111738199B (zh) Image information verification method, apparatus, computing device and medium
Nadhan et al. Smart attendance monitoring technology for industry 4.0
WO2015003287A1 (zh) Behavior recognition and tracking system and operating method thereof
TW201502999A (zh) Behavior recognition and tracking system
WO2021104388A1 (en) System and method for interactive perception and content presentation
JP6418270B2 (ja) 情報処理装置及び情報処理プログラム
WO2016035351A1 (ja) 情報処理装置、情報処理プログラム、情報処理方法及び記憶媒体
US9324292B2 (en) Selecting an interaction scenario based on an object
US20210182542A1 (en) Determining sentiments of customers and employees
JP2016045743A (ja) 情報処理装置およびプログラム
Lin et al. Face detection based on the use of eyes tracking
US20220269890A1 (en) Method and system for visual analysis and assessment of customer interaction at a scene
TWI726470B (zh) Image recognition system
Priyanka et al. Genuine selfie detection algorithm for social media using image quality measures
Živković Application for student attendance based on face recognition
JP6944020B2 (ja) 情報処理装置

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20191009

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: G06K 9/46 20060101ALI20191030BHEP

Ipc: G06K 9/00 20060101AFI20191030BHEP

INTG Intention to grant announced

Effective date: 20191118

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1264423

Country of ref document: AT

Kind code of ref document: T

Effective date: 20200515

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602018004170

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20200429

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200730

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200829

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200729

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200831

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1264423

Country of ref document: AT

Kind code of ref document: T

Effective date: 20200429

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200729

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602018004170

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20210201

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602018004170

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201004

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20201031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210501

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201004

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200429

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211031

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211031

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20221004

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20221004