EP2422325B1 - Self-service terminal with at least one image-data-generating camera for detecting manipulation attempts - Google Patents


Info

Publication number
EP2422325B1
EP2422325B1 (application EP10717088.8A)
Authority
EP
European Patent Office
Prior art keywords
atm
camera
image
teller machine
automated teller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP10717088.8A
Other languages
German (de)
English (en)
French (fr)
Other versions
EP2422325A1 (de)
Inventor
Steffen Priesterjahn
Dinh-Khoi Le
Michael Nolte
Alexander Drichel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wincor Nixdorf International GmbH
Original Assignee
Wincor Nixdorf International GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wincor Nixdorf International GmbH filed Critical Wincor Nixdorf International GmbH
Publication of EP2422325A1 publication Critical patent/EP2422325A1/de
Application granted granted Critical
Publication of EP2422325B1 publication Critical patent/EP2422325B1/de

Classifications

    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F19/00Complete banking systems; Coded card-freed arrangements adapted for dispensing or receiving monies or the like and posting such transactions to existing accounts, e.g. automatic teller machines
    • G07F19/20Automatic teller machines [ATMs]
    • G07F19/207Surveillance aspects at ATMs

Definitions

  • the invention relates to a self-service terminal with at least one image data-generating camera according to the preamble of claim 1.
  • the invention relates to a self-service terminal, which is designed as an ATM.
  • The spied-out data is then transmitted to a remote receiver via a transmitter built into the keyboard superstructure, or is saved in a data memory located in the keyboard overlay.
  • Many of today's skimming devices are very difficult to distinguish with the human eye from original controls (keyboard, card reader, etc.).
  • Monitoring systems are known which have one or more cameras mounted in the area of the self-service terminal's location, capturing the entire control panel and often also the area where the user stands.
  • Such a solution is described, for example, in DE 201 02 477 U1.
  • By means of the local camera monitoring, both the control panel itself and the user's area in front of it can be captured.
  • A sensor is provided in order to detect whether a person is present in the user area.
  • A self-service terminal designed as an ATM is described, with a camera that is mounted above the screen and captures at least a portion of the control panel. The camera is aligned to capture the cash dispenser and the keypad.
  • A self-service terminal with a control panel and controls disposed therein is known, in which a camera for detecting tampering attempts is mounted in a housing portion of the self-service terminal surrounding the control panel.
  • The control panel comprises elements (keyboard, card slot, etc.) which are provided for users of the self-service terminal, wherein at least one image-data-generating camera is provided for monitoring the self-service terminal.
  • the camera captures at least one of the elements provided in the control panel and generates image data from a plurality of still images.
  • the camera is connected to a data processing unit (micro-processor), which preprocesses the generated image data into a resulting image that serves to detect tampering attempts.
  • the difference between two individual images is formed there in order to detect changes which may indicate a manipulation attempt.
  • Also known is a self-service terminal with a camera arrangement which is equipped with several cameras for monitoring the self-service terminal, which is e.g. an ATM.
  • Several cameras are mounted in the vicinity of the control panel of the ATM. For example, one camera captures the card slot and another camera captures the cash slot. A further camera is aimed at the user of the ATM.
  • DE 203 18 489 U1 describes an ATM with a monitoring device having two cameras (image pickup elements 21 and 22 in Fig. 2). One camera is aimed at the user; the other camera is aligned with the area of the cash dispensing opening or the cash dispenser.
  • The object of the present invention is to propose a solution for camera surveillance which allows secure detection of manipulation attempts without the use of additional sensors. A high-quality database shall be created and provided for the detection of manipulation attempts.
  • A self-service terminal in which the data processing unit combines the generated image data of the single-image recordings, by means of image data processing that performs segmentation and/or edge detection, into the result image, wherein the data processing unit segments the single-image recordings into a plurality of subregions assigned to the at least one detected element and processes the frame data differently per segment.
  • The data processing unit composes the result image from the subareas of different individual image recordings, and processes the image data from the subareas with different image data processing methods and/or with different variants of image data processing.
  • The at least one camera preferably generates the image data of the individual image recordings as a function of predefinable criteria, in particular at predeterminable time intervals and/or at different exposure ratios or ambient brightness. Predefinable camera settings, in particular exposure times and/or frame rates, are also taken into account.
  • the data processing unit combines this image data (individual image data) by means of image data preprocessing, in particular averaging, median formation and / or so-called exposure blending, to the result image or overall image, which is then available for manipulation recognition. It is also possible to continuously calculate result or overall images (result image sequence) at time intervals in order to then be available for a comparison for the detection of manipulation attempts.
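By way of illustration, the exposure-blending idea mentioned above can be sketched as a per-pixel weighted average, in which well-exposed pixels (close to mid-gray) receive the highest weight. The weighting formula and the tiny frames are purely illustrative assumptions; the patent does not specify a concrete blending rule.

```python
def exposure_blend(frames, mid=128.0):
    """Blend several differently exposed grayscale frames into one
    result image: each pixel value is a weighted average, where
    well-exposed pixels (close to mid-gray) get the highest weight.
    The weight function is an illustrative assumption."""
    rows, cols = len(frames[0]), len(frames[0][0])
    result = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            num, den = 0.0, 0.0
            for f in frames:
                v = f[r][c]
                w = 1.0 - abs(v - mid) / mid  # weight in [0, 1]
                num += w * v
                den += w
            result[r][c] = num / den if den else mid
    return result

# Two hypothetical 1x2 frames: an overexposed pixel (255, near-zero
# weight) paired with a well-exposed pixel (128, full weight).
over = [[255, 200]]
good = [[128, 100]]
blended = exposure_blend([over, good])
```

The blended pixel at position (0, 0) stays close to the well-exposed value 128, because the overexposed value 255 contributes almost no weight.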
  • At least one further camera may be provided, which is also mounted on or in the self-service terminal in the vicinity of the control panel and captures at least one of the controls, such as the keyboard, card slot or cash dispenser.
  • the image data or individual images generated by this additional camera can also be combined together with the image data of the other camera to form a result image or to a result image sequence.
  • the result images obtained from the single image recordings have a significantly higher image data quality than the respective frames.
  • a high quality database in the form of preprocessed image data is provided for tamper detection.
  • the plurality of individual image recordings are created as a function of at least one predefinable function, which specifies different exposure times for the single image recordings. This ensures that no single images are taken with the same exposure time, which in turn is advantageous for exposure blending.
  • the at least one predefinable function corresponds to at least one ramp function which indicates increasing and / or decreasing exposure times for a series of single-image recordings.
  • The ramp can also be falling, i.e., the exposure times successively decrease.
  • the total duration of all still images can also be specified and, for example, be 10 seconds.
  • One of the predeterminable functions specifies the different exposure times such that they lie within a specific range of values, i.e., for example, within a first lower value range, which, for example, ranges from 0.5 ms to 1000 ms. This value range is preferably suitable for a so-called day mode, i.e. for the case in which a brightness and/or contrast value of at least one of the individual image recordings exceeds a predefinable threshold value.
  • Otherwise, the different exposure times lie within a second upper value range, for example ranging from 1000 ms to 2000 ms.
  • the functions can also be put together to form a function.
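The ramp functions described above can be sketched, for example, as a simple linear series of exposure times. The linear step width is an illustrative assumption; only the monotonicity and the day/night value ranges come from the text above.

```python
def exposure_ramp(t_start_ms, t_end_ms, n):
    """Return n exposure times forming a monotone ramp from t_start_ms
    to t_end_ms, so that no two single-image recordings share the same
    exposure time (advantageous for exposure blending)."""
    if n == 1:
        return [float(t_start_ms)]
    step = (t_end_ms - t_start_ms) / (n - 1)
    return [t_start_ms + i * step for i in range(n)]

# Hypothetical day-mode series: rising ramp within 0.5 ms .. 1000 ms.
day_series = exposure_ramp(0.5, 1000.0, 5)
# Hypothetical night-mode series within the upper range 1000 .. 2000 ms.
night_series = exposure_ramp(1000.0, 2000.0, 5)
```

A falling ramp is obtained simply by swapping the start and end values.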
  • the at least one camera generates the image data of the single image recordings as a function of events, in particular of events recorded by the latter or by another camera.
  • Such events may be, e.g., a sudden onset of image lightening or darkening. They can also be, e.g., control signals (operation of the keyboard or the like). It may be advantageous if single-image recordings are made not only during the occurrence of the event, but also afterwards.
  • the data processing unit combines the generated image data of the single image recordings by means of one or more suitable image data processing, such as so-called exposure blending.
  • an image segmentation and / or edge detection can also be used.
  • the data processing unit segments the individual image recordings into a plurality of partial regions assigned to the at least one detected element and processes the individual image data differently in different segments.
  • the data processing unit composes the result image from the subareas of different single-frame images. It can also be provided that the data processing unit processes the image data from the subregions with different image data processing and / or with different variants of image data processing.
  • the partial regions preferably comprise at least one near or inner region and an ambient or outer region of the detected element, such as the slot region and the surrounding region of a card insertion funnel. It can also be provided that one of the partial regions comprises a transition region between the inner region and the outer region of the element.
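Such a segmentation into an inner region, a transition region and an outer region can be sketched, for example, as a column-wise partition of the image. The layout and the boundary indices are purely illustrative assumptions, not taken from the patent.

```python
def segment_regions(image, inner_end, outer_start):
    """Split a 2D image into three subregions by column index:
    region I (inner/slot area), region II (transition region) and
    region III (outer/surrounding area). Boundaries are illustrative."""
    region_i = [row[:inner_end] for row in image]
    region_ii = [row[inner_end:outer_start] for row in image]
    region_iii = [row[outer_start:] for row in image]
    return region_i, region_ii, region_iii

# Tiny 2x8 test image with pixel values equal to the column index.
img = [list(range(8)) for _ in range(2)]
i_, ii_, iii_ = segment_regions(img, 3, 5)
```

Each region can then be handed to a different image data processing method, as described above.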
  • The data processing unit is preferably designed such that it carries out both the image data preprocessing and the actual image data evaluation, i.e., it calculates the preprocessed image data of the resulting image from the individual image data and evaluates these by means of image processing for the purpose of detecting manipulation attempts.
  • the data processing unit has a first stage, which receives the preprocessed image data, for the actual image processing or image data evaluation, wherein, in particular, shadow removal, edge detection, vectorization and / or segmentation can be carried out.
  • the data processing unit also has a second stage downstream of the first stage for feature extraction, in particular by means of blob analysis, edge position and / or color distribution.
  • the data processing unit has a third stage downstream of the second stage for classification.
  • the data processing unit is integrated in the self-service terminal.
  • the elements provided in the control panel of the self-service terminal and detected by the at least one camera include e.g. a cashbox, a keyboard, a tray, a card slot, and / or a screen.
  • If the data processing unit detects a manipulation attempt on the detected elements by processing the preprocessed image data of the result image, it triggers an alarm, blocks the self-service terminal and/or triggers the additional camera.
  • This additional camera may be a portrait camera, i.e., a camera which captures the area where a user, in particular his head, is located during operation of the self-service terminal. Thus, if necessary, a portrait of the user can be captured.
  • the respective camera and / or the data processing unit is deactivated during the operation and / or maintenance of the self-service terminal.
  • Fig. 1 shows, in a perspective view, the basic structure of a self-service terminal in the form of an automated teller machine ATM.
  • Its control panel includes a cash dispenser 1 (also called a shutter) and a keyboard 2, i.e., controls on which manipulation attempts, e.g., in the form of superstructures, may occur for the purpose of skimming.
  • The ATM is equipped with several cameras for detecting such and similar manipulation attempts.
  • Fig. 1 first shows those cameras which are mounted at different locations, preferably in the vicinity of the control panel. These are a side camera CAMS, a top-view camera CAMD and an additional portrait camera CAMO.
  • The cameras CAMS and CAMD are mounted within a demarcation, frame or the like. Each of these cameras CAMS and CAMD captures, from the outside, at least one of the elements arranged in the control panel of the ATM, e.g. the cash dispenser 1 (shutter) and/or the keyboard 2.
  • the lateral camera CAMS preferably detects exactly these two elements 1 and 2; the top view camera CAMD also detects other elements (see also Fig. 6 ).
  • A camera CAMK integrated in the card input funnel 4 also captures the interior of this element. This camera CAMK and its function will be described later in detail with reference to Fig. 7a/b.
  • The additional camera CAMO is located in the upper housing part of the ATM and is directed at the area in which the user stands when operating the ATM.
  • This camera CAMO captures the head or the face of the user and is therefore referred to here as a portrait camera.
  • Fig. 2 shows the detection range of the camera CAMS, which is located in a side housing part that frames or surrounds the control panel of the ATM.
  • this camera CAMS is equipped with a wide-angle lens in order to capture at least these two elements or subregions of the control panel.
  • The ATM is designed so that said elements 1 and 2 preferably have surfaces as homogeneous as possible, with edges delimiting them. This simplifies object recognition. By mounting the camera CAMS at this particularly suitable position, the said subareas or elements 1 and 2 can be optically measured very reliably. It can be provided that the camera is focused in particular on certain areas.
  • Another perspective, namely that of the top-view camera CAMD, is illustrated by Fig. 6.
  • It shows the detection field of this camera CAMD, which is installed in the upper part of the ATM (see also Fig. 1) and captures the control panel from above.
  • In addition to the cash dispenser 1 and the keyboard 2, further elements may be provided in the detection range of the camera, such as a shelf 3 near the keyboard, a card input funnel 4, i.e. the feeding part for the card reader, and e.g. a screen 5 or display.
  • these further mentioned elements 3, 4 and 5 represent potential targets for manipulation attempts.
  • a result image or a result image sequence of high quality is calculated from a plurality of single image recordings.
  • Figs. 3a-c show, by way of example, three frames F1, F2 and F3 captured at different times by the side camera CAMS (see Fig. 2). From these, a result image R, shown in Fig. 3d, is computed by means of an image data preprocessing which will be described in more detail later.
  • each of the still images F1, F2 and F3 contains certain image distortions or aberrations due to, for example, reflection effects, poor ambient light, appearance of foreign objects in the form of persons and / or objects, etc.
  • These are schematic representations which illustrate each recording situation.
  • the first still image F1 was taken with incident sunlight, causing disturbing reflections on the surface of the control panel in the area of the cash dispenser. This is illustrated here by a light beam coming from the left.
  • In frame F2, a person appears, covering the keyboard of the ATM.
  • In frame F3, in turn, a foreign object appears in the background.
  • The result image R is formed by combining the individual image data, whereby the interfering effects are detected and eliminated by comparing the individual images with one another.
  • Many subregions of the individual image F1 can be utilized, except for the reflection region; the individual image F1 reproduces the surface structure of the housing and the operating elements particularly well.
  • From the single image F2, likewise, many subareas can be utilized, except for the area of the keyboard and the environment in front of the ATM; here, in particular, the edges of the housing and the operating elements are rendered clearly recognizable.
  • the single image F3 also has many useful portions, in which case in particular the keyboard is reproduced without interference.
  • the result image R can then be calculated from the various subregions and the many image components of the individual images F1 to F3.
  • The result image does not represent a real image acquisition, but corresponds to an optimally calculated image composition that shows the captured area or the operating elements in a form freed from interference. This achieves a very high image quality, which clearly exceeds the quality of the individual images. Thus, an optimal basis for the later actual image data evaluation is created.
  • the plurality of individual image recordings can be created in dependence on at least one predetermined function, which specifies the different exposure times for the single image recordings. This ensures that no single image recordings are made with the same exposure time, which in turn is advantageous for exposure blending.
  • In Fig. 9, a series with a plurality of still images F1 to Fn is shown schematically, illustrating that each frame has a different exposure time T1, T2, ... Tn.
  • The series (exposure series) is preferably specified in accordance with a monotone increasing or decreasing function, so that, for example, the following applies: T1 < T2 < T3 < ... < Tn.
  • The still-image recordings can also be made depending on lighting conditions.
  • The exposure times may depend on various parameters, such as the location of the ATM (indoor, outdoor), the type and/or mounting location of the camera, lighting conditions, etc.
  • Edge detection can also be used, as illustrated by the schematic representations of Fig. 4:
  • In Fig. 4, a first row shows, as sub-figures 4a1) to 4a3), three frames F1' to F3' which have been recorded at different exposure times by the lateral camera CAMS (see Figs. 1 and 2).
  • This first row reproduces three differently exposed images, namely a1) a very brightly exposed image F1 ', a2) a normally exposed image F2' and a3) an underexposed image F3 '.
  • A second row shows, as sub-figures 4b1) to 4b3), the individual images respectively obtained by means of edge detection.
  • These edge images shown in b1) to b3) would actually represent white edge progressions on a black background.
  • For better reproduction, these representations are shown here in inverted form, i.e., black edge gradients on a white background.
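A minimal sketch of such an edge detection with subsequent inversion is given below. The simple neighbor-difference detector stands in for whatever edge operator is actually used; the threshold and the tiny test image are illustrative assumptions.

```python
def edge_image(img, threshold=10):
    """Very simple edge detector: a pixel counts as an edge if the
    absolute horizontal or vertical intensity difference to its
    neighbor exceeds the threshold. Returns a binary image with
    white edges (255) on a black background."""
    rows, cols = len(img), len(img[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows - 1):
        for c in range(cols - 1):
            gx = abs(img[r][c + 1] - img[r][c])
            gy = abs(img[r + 1][c] - img[r][c])
            if max(gx, gy) > threshold:
                out[r][c] = 255
    return out

def invert(img):
    """Invert a binary edge image: black edges on white background,
    as in the reproduced sub-figures."""
    return [[255 - v for v in row] for row in img]

# A vertical brightness step produces a vertical edge line.
step = [[0, 0, 200, 200] for _ in range(3)]
edges = edge_image(step)
```

Applying `invert(edges)` yields the black-on-white representation used in the figures.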
  • Figs. 5a to 5c illustrate a further variant or additional measure for the image data preprocessing of single images F1", F2", F3", etc.
  • The image data is subjected pixelwise to a median formation. Fig. 5a) schematically shows the image data for the first pixel in each frame.
  • The first pixel in the image F1" has the value "3", the image F2" the value "7", and the image F3" the value "3".
  • The next images F4" and F5" have, at the first pixel location, the values "5" and "4", respectively.
  • The result for the first pixel is thus a sequence of image data values; sorting these values according to their size yields the following sequence: 3, 3, 4, 5 and 7.
  • the median of this sequence is thus the value "4".
  • This value is entered in the result image or target image R" at the first pixel location (see Fig. 5c).
  • The formation of the median value has the advantage over averaging (the average value here would be "4.4") that moving objects possibly present in individual images are completely eliminated.
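The median example above can be reproduced directly with the pixel values named in the text:

```python
from statistics import median, mean

# Pixel values at the first pixel location of frames F1" .. F5"
values = [3, 7, 3, 5, 4]
print(sorted(values))   # [3, 3, 4, 5, 7]
print(median(values))   # 4
print(mean(values))     # 4.4 — averaging would blend transient objects in
```

The median (4) ignores the outlying value 7 entirely, whereas the mean (4.4) is pulled toward it; this is exactly why median formation removes moving objects that appear in only a few frames.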
  • The image data processing, which can also take place on the basis of image data from a plurality of cameras, is carried out in a data processing unit which also performs the actual image analysis and which is shown in Fig. 8.
  • FIG. 8 shows the block diagram of a data processing unit 10 according to the invention, to which the cameras CAMS and CAMK are connected, as well as a CCTV unit 20, which is connected to the data processing unit 10.
  • The data processing unit 10 receives the image data D from the camera CAMS and the image data D' from the camera CAMK. Both cameras take frames at predeterminable time intervals, the recordings being controlled by a pre-stage or control stage ST. In particular, the respective exposure time is specified, so that a series of individual shots (exposure series) is created (see also the later description of Figs. 9 and 10). Then, in a first stage 11, the preprocessing of the frame data follows. There, among other things, result images are created on the basis of the image data processing methods described above or similar methods.
  • the thus prepared image data D * has a very high quality and is then used as input data for a subsequent second stage 12 which serves for feature extraction.
  • This is followed by a third stage 13 for the classification of the processed input data.
  • the stage 13 is connected to an interface 14, via which various alarm or monitoring devices can be activated or addressed. These devices include, among other things, image tampering detection (IFD).
  • the first stage 11 which serves for image preprocessing, is also connected to a second interface 15 via which a connection to the CCTV unit 20 is established. With the help of this CCTV unit, for example, a remote monitoring or remote diagnosis can be performed. The detection of manipulation attempts and the alarming will be described in more detail later.
  • Fig. 7a illustrates a camera installation situation in which the camera CAMK is integrated directly into the card input funnel 4.
  • In this case, the already present illumination L of the card slot can be used.
  • The camera CAMK is mounted laterally next to the card slot or insertion slot, which is made of a special, light-conducting plastic K.
  • the lighting L is realized by one or more light sources, such as light-emitting diodes, wherein the generated light is guided via the light-conducting plastic K to the actual insertion slot in order to illuminate it.
  • the light can be guided coming from above and below, so that the card slot is illuminated as evenly as possible.
  • the generated light can be optimally adapted in intensity to the requirements.
  • the light can be colored by the use of colored LEDs and / or color filters to be adapted in particular to the requirements of the camera CAMK.
  • Predefinable subregions are detected and optically measured. This allows deviations from reference values (normal state with regard to image composition, image content, weighting of pixel areas, etc.) to be recognized quickly and reliably.
  • The image data processing can also take place region by region, i.e. only for parts of the image.
  • Fig. 7b illustrates the detection area of the camera CAMK segmented into different partial areas, and shows that it is essentially subdivided into three partial areas I, II and III.
  • The first subarea I primarily covers the inner area of the card input funnel, i.e. the actual card slot; subarea III covers the outer area of the card input funnel; and subarea II covers the transition area in between.
  • The camera CAMK is aligned here such that in subarea III a person located in front of the ATM (user or attacker) can also be detected.
  • This image data can, in particular, be combined with that of the portrait camera CAMO (see Fig. 1).
  • the camera CAMK is preferably installed on the same side of the terminal as the camera CAMS, so that the image data of these two cameras can also be compared.
  • The lighting L (see Fig. 7a) can be used to achieve the best possible illumination for the image recordings.
  • A colored illumination in the green range is particularly advantageous, because the image sensors or CCD sensors of the camera are particularly sensitive to green shades and have the greatest resolution there.
  • the lighting L improves the object recognition, especially in low light conditions (location, night time, etc.).
  • The illumination overcomes reflections on a superstructure to be recognized that may be caused by external light (e.g. solar radiation).
  • the already provided lighting L of the card insertion funnel is a reliable light source for the camera CAMK.
  • the actual card slot here has a different color than the card input hopper, so that a larger contrast difference is given, which improves the image analysis.
  • the CAMK camera is designed here as a color camera with a minimum resolution of 400x300 pixels. In the case of saturated illumination, it is thus possible in particular to use a color value distribution-based method for detecting superstructures and the like.
  • The camera CAMK has a wide-angle lens, so that the outer area (subarea III in Fig. 7b) is also captured well.
  • FIG. 8 shows the block diagram of a data processing unit 10 according to the invention, to which the cameras CAMS and CAMK are connected, as well as a CCTV unit 20, which is connected to the data processing unit 10.
  • the data processing unit 10 has in particular the following stages or modules:
  • A pre-stage or control stage ST controls the cameras so as to produce single-frame image data D or D', from which preprocessed image data D* can then be calculated, by means of the methods described above, for the actual data evaluation.
  • a first stage 11 for image processing thereof, a second stage 12 for feature extraction and a third stage 13 for the classification of the processed data are provided.
  • the stage 13 is connected to an interface 14, via which various alarm or monitoring devices can be activated or addressed.
  • These devices include, among other things, image tampering detection (IFD).
  • the first stage 11, which serves for image processing, is also connected to a second interface 15 via which a connection to the CCTV unit 20 is established. With the help of this CCTV unit, for example, a remote monitoring or remote diagnosis can be performed.
  • the control stage ST is responsible for the control of the cameras CAMS and CAMK for generating the individual image data D or D '.
  • the subsequent first stage 11 calculates therefrom the prepared image data D * (calculated total image data), wherein in particular measures such as shadow removal, edge detection, vectorization and / or segmentation are performed.
  • The downstream second stage 12 serves for feature extraction, which can be carried out, for example, by means of a so-called blob analysis, an edge positioning and/or a color distribution.
  • Blob analysis is used to detect contiguous regions in an image and to make measurements on the blobs.
  • A blob (Binary Large Object) is an area of adjacent pixels with the same logical state. All pixels in an image belonging to a blob are in the foreground; all other pixels are in the background. In a binary image, background pixels have values corresponding to zero, while every non-zero pixel is part of a binary object.
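A minimal sketch of such a blob analysis is connected-component labeling by flood fill; the choice of 4-connectivity and the blob-size measurement are illustrative, since the patent does not fix these details.

```python
from collections import deque

def label_blobs(binary):
    """Find 4-connected blobs (regions of adjacent foreground pixels)
    in a binary image and return a list of blob sizes in pixels."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                # Flood-fill one blob with a breadth-first search.
                queue, size = deque([(r, c)]), 0
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                sizes.append(size)
    return sizes

img = [[1, 1, 0, 0],
       [0, 1, 0, 1],
       [0, 0, 0, 1]]
print(label_blobs(img))  # two blobs of sizes [3, 2]
```

Measurements on the blobs (size, position, shape) can then serve as features for the classification stage.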
  • A classification is made which determines, on the basis of the extracted features, whether or not a hostile manipulation has occurred at the self-service terminal ATM.
  • the size and nature of the actions taken or countermeasures can be configured by the operator of the ATM via the system described here.
  • the cameras CAMS and CAMD can be provided directly on the control panel, with the cameras CAMS and CAMD detecting the control panel from the outside and the CAMK camera eg detecting the card input funnel from the inside.
  • an additional portrait camera can be installed (see CAMO in Fig. 1 ).
  • the cameras CAMS and CAMD are used on the control panel and the camera CAMK in the card input.
  • the portrait camera CAMO is also used.
  • all cameras have a resolution of at least 2 megapixels.
  • the lenses used have a viewing angle of about 140 degrees and more.
  • The exposure time of the cameras used is freely adjustable over a wide range, for example from 0.25 ms up to 8,000 ms. This allows adaptation to a wide variety of lighting conditions.
  • Applicant's experiments have shown that a camera resolution of about 10 pixels per degree can be achieved. Based on a distance of one meter, an accuracy of 1.5 mm per pixel can be achieved. This, in turn, means that manipulations can be reliably detected starting at a deviation from the reference of only 2 to 3 mm. The closer the camera lens is to the detected element or object, the more accurate the measurement. Thus, at closer range even an accuracy of less than 1 mm can be achieved.
  • The detection of the cash-out tray (shutter) 1 makes it possible to check for manipulations in the form of so-called cash trappers, i.e. special superstructures.
  • the detection of the keypad makes it possible to determine there manipulation attempts by superstructures or changes to light protection measures and the like.
  • the detection of the support surface makes it possible in particular to detect complete overbuilding.
  • the detection of the card input funnel 4, in particular by a camera integrated therein, makes it possible to detect local manipulations.
  • Deviations at the rear outer edge of the support surface can be detected from as little as 4 mm; deviations at the lower edge of the shutter from as little as 8 mm.
  • The data processing unit 10 performs, in particular, a comparison of the recorded image data D with reference data.
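Such a comparison with reference data can be sketched, for example, as a simple pixel-wise deviation count; the tolerance thresholds and the toy images are purely illustrative assumptions.

```python
def deviation_exceeds(image, reference, pixel_tol=10, count_tol=5):
    """Compare recorded image data against reference data: report a
    possible manipulation if more than count_tol pixels deviate from
    the reference by more than pixel_tol. Thresholds are illustrative."""
    deviating = 0
    for row_img, row_ref in zip(image, reference):
        for a, b in zip(row_img, row_ref):
            if abs(a - b) > pixel_tol:
                deviating += 1
    return deviating > count_tol

# Hypothetical 6x6 grayscale reference and two recordings.
ref = [[100] * 6 for _ in range(6)]
ok = [[102] * 6 for _ in range(6)]          # within tolerance
tampered = [[100] * 6 for _ in range(3)] + [[180] * 6 for _ in range(3)]
```

In practice the comparison would operate on the high-quality preprocessed result images D*, so that transient effects have already been removed before the reference check.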
  • an image of the outside area can be examined for its homogeneity and compared with the image of the outside area of the control panel camera.
  • the connection of the system to the Internet via the interface 23 makes it possible to remotely control the camera or the various cameras.
  • the acquired image data can also be transmitted via the Internet connection to a video server.
  • The respective camera thus acts as a virtual IP camera.
  • The CCTV unit 20 described above is used for such a video surveillance facility; the interface 15 to the CCTV unit is designed for corresponding functions.
  • The system is designed so that no false alarms are generated by hands and/or objects in the picture during normal operation (e.g. withdrawing money, checking an account balance, etc.). Therefore, tamper detection is disabled during periods of normal machine usage. Likewise, time periods in which, for example, cleaning or another short-term use occurs (retrieval of account statements, interactions before and after the start of a transaction) are not used for tamper detection. Essentially, therefore, preferably only rigid and immovable manipulation attempts are analyzed and recognized.
  • the system is designed to work under a wide variety of lighting conditions (day, night, rain, overcast, etc.). Briefly changing lighting conditions, such as reflections, shadows and the like, are compensated for or ignored during image processing in order to avoid false alarms. In addition, technical events, such as the failure of a light source and the like, are taken into account. These and other special cases are recognized and handled in particular by the third stage, the classification.
  • the method for manipulation detection performed by the described system has in particular the following sequence (see also Fig. 8):
  • preprocessed overall image data D* are calculated from the original single-frame data D or D'; these serve as the basis for the actual evaluation of the data.
  • in a first step a picture is taken, the camera parameters being adjusted so as to produce suitable recordings.
  • a series of images, or the corresponding image data D or D', is recorded, which then serves as the basis or reference for the preprocessing.
  • shadow removal, removal of moving objects, noise removal and/or the combination of differently exposed shots (exposure blending) can be applied in order to eliminate interfering influences and to obtain a suitable background image.
  • the cameras are, among other things, set to different exposure times in order to remove reflections and to capture well-lit areas. Preferably, the images are collected over a predetermined period of time in order to obtain the best possible source images for manipulation detection.
  • These steps can be performed by means of the recording control ST in the first stage 11.
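As an illustration of this preprocessing stage, a per-pixel median over a series of single-frame recordings suppresses noise and transient objects (a simplified stand-in for the combination of steps named above; the patent does not prescribe this exact operator):

```python
import numpy as np

def preprocess(frames):
    """Blend a series of single-frame recordings D into one result image D*:
    the per-pixel median suppresses noise and objects that appear in only
    a few frames, leaving the static background."""
    return np.median(np.stack(frames).astype(float), axis=0)

frames = [np.full((4, 4), 100.0) for _ in range(5)]
frames[2][1, 1] = 255.0          # e.g. a hand passing through one frame
result = preprocess(frames)
print(result[1, 1])              # 100.0 -> the transient object is gone
```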
  • for the actual data evaluation a feature extraction (step 12) is carried out, in which image analysis methods are applied to the preprocessed images or image data in order to check them for certain features, such as edge positions or color distributions.
  • for each feature a number or value can be specified which indicates how well the corresponding feature was found in the examined image.
  • the values are summarized in a so-called feature vector.
  • a classification is performed (step 13), i.e. the feature vector is passed to a classification procedure which decides whether a manipulation is present or not. Preference is given to those types of classifiers that can also indicate the confidence, i.e. the probability or certainty, of the decision. Classification mechanisms that can be used include, for example:
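A toy version of such a feature extraction, combining a simple edge measure and a coarse brightness distribution into one feature vector (the concrete features and their number are assumptions, not taken from the patent):

```python
import numpy as np

def feature_vector(img: np.ndarray) -> np.ndarray:
    """Concatenate per-row horizontal edge strength with a 4-bin
    brightness histogram into a single feature vector."""
    edges = np.abs(np.diff(img.astype(float), axis=1)).mean(axis=1)
    hist, _ = np.histogram(img, bins=4, range=(0, 256))
    return np.concatenate([edges, hist / img.size])

img = np.tile(np.array([0, 0, 255, 255], dtype=float), (4, 1))
vec = feature_vector(img)
print(vec.shape)   # (8,) -> 4 edge values + 4 histogram bins
```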
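Independent of which concrete classifier is chosen (the enumeration above is not reproduced here), the decision step with an attached confidence value might look like this toy distance-based classifier; all names and the confidence formula are assumptions:

```python
import math

def classify(vec, ref_ok, ref_manip):
    """Decide 'manipulation' vs 'ok' by distance to two reference feature
    vectors, and report a confidence in [0, 1] for the decision."""
    d_ok = math.dist(vec, ref_ok)
    d_manip = math.dist(vec, ref_manip)
    label = "manipulation" if d_manip < d_ok else "ok"
    confidence = abs(d_ok - d_manip) / (d_ok + d_manip + 1e-9)
    return label, confidence

label, conf = classify([0.9, 0.1], ref_ok=[1.0, 0.0], ref_manip=[0.0, 1.0])
print(label)   # ok
```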
  • the system described herein is preferably modular in design to allow for different configurations.
  • the actual image processing as well as the CCTV connection are realized in different modules (see FIG. 4 ).
  • the system presented here is also suitable for documenting the detected manipulations or digitally archiving them.
  • the captured images are provided with corresponding meta information, such as a timestamp, the type of manipulation, etc., and stored on a hard disk in the system or in a connected PC.
  • messages may be forwarded to a platform, e.g. error messages, status messages (deactivation, mode change), statistics, suspected-manipulation messages and/or alarm messages.
  • a corresponding message with the respective alarm level can be forwarded to the administration interface or the interface.
  • the following options are also implemented at this interface:
  • querying camera data, such as the number of cameras, build status, serial number, etc. (camera master data), setting camera parameters and/or registering for alarms (notifications).
  • the invention presented here is particularly suitable for reliably detecting hostile manipulations of a self-service terminal, such as an ATM.
  • the control panel is continuously and automatically monitored by at least one camera.
  • the elements captured by the camera are optically measured in order to detect deviations from reference data. It has been shown that even deviations in the millimeter range can be reliably detected.
  • preferably a combination of edge detection and segmentation is used, so that the contours of objects left behind can be clearly recognized and marked. In the case of a manipulation attempt, countermeasures or actions can be triggered.
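A minimal sketch of such a difference-based segmentation (the deviation threshold and the bounding-box representation are illustrative assumptions; the patent combines this with edge detection):

```python
import numpy as np

def changed_region(reference: np.ndarray, current: np.ndarray,
                   thresh: float = 25.0):
    """Mark pixels deviating from the reference image beyond `thresh` and
    return the bounding box (top, left, bottom, right) of the marked area,
    outlining the contour of an object left behind; None if unchanged."""
    mask = np.abs(current.astype(float) - reference.astype(float)) > thresh
    if not mask.any():
        return None
    rows, cols = np.where(mask)
    return (int(rows.min()), int(cols.min()), int(rows.max()), int(cols.max()))

reference = np.zeros((5, 6))
current = reference.copy()
current[2:4, 3:5] = 50.0                   # a foreign object on the panel
print(changed_region(reference, current))  # (2, 3, 3, 4)
```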
  • the invention significantly increases the reliability with which manipulations can be detected.
  • the invention has the following camera arrangement:
  • the cameras are connected to the described data processing unit.
  • within the data processing unit, the image data or information obtained from the cameras is used, among other things, as follows:
  • recognition or distinction of artificial and natural obscuration: if a camera is covered, its captured image contradicts the images from the other cameras; if natural or artificial light fails, the effect is equally noticeable on all cameras.
  • detection of deception attacks on the camera system, e.g. by means of previously taken photos held in front of the lens: if one camera shows a different picture (different brightness, movement, colors, etc.), this indicates an attempt at deception.
  • increasing the robustness of the cover detection at the card entry funnel: if the card slot is covered, the integrated camera CAMK shows a different image of the outside area than the other cameras do.
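The case distinction between a covered camera and a global light failure can be reduced to a simple plausibility check across the mean brightness of all cameras (a sketch; the threshold and the return labels are assumptions):

```python
def occlusion_verdict(mean_brightness, dark_threshold=20.0):
    """One dark camera among bright ones suggests deliberate covering;
    all cameras dark suggests a (natural or artificial) light failure."""
    dark = [b < dark_threshold for b in mean_brightness]
    if all(dark):
        return "lighting failure"
    if any(dark):
        return "camera covered"
    return "ok"

print(occlusion_verdict([110.0, 105.0, 4.0]))   # camera covered
print(occlusion_verdict([4.0, 6.0, 3.0]))       # lighting failure
```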
  • the preprocessing of the camera image data described here leads to an increase in the reliability of the subsequent data evaluation for the detection of manipulation attempts and accordingly also serves to avoid false alarms.
  • a self-service terminal which has at least one camera for detecting tampering attempts; the camera captures one or more elements provided in the control panel, such as a keyboard, cash dispenser or card slot, and generates image data from a plurality of single-image recordings.
  • the at least one camera is connected to a data processing unit, which preprocesses the generated image data (single-frame data) into a result image.
  • the preprocessed image data of the result image can, for example, be calculated from the single-frame data by exposure blending and provide a very good basis for a data analysis aimed at tamper detection.
  • the present invention has been described using the example of an ATM, but is not limited thereto; it can be applied to any type of self-service terminal.
EP10717088.8A 2009-04-22 2010-04-16 Selbstbedienungsterminal mit mindestens einer bilddaten erzeugenden kamera zum erkennen von manipulationsversuchen Active EP2422325B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102009018318A DE102009018318A1 (de) 2009-04-22 2009-04-22 Selbstbedienungsterminal mit mindestens einer Bilddaten erzeugenden Kamera zum Erkennen von Manipulationsversuchen
PCT/EP2010/055014 WO2010121957A1 (de) 2009-04-22 2010-04-16 Selbstbedienungsterminal mit mindestens einer bilddaten erzeugenden kamera zum erkennen von manipulationsversuchen

Publications (2)

Publication Number Publication Date
EP2422325A1 EP2422325A1 (de) 2012-02-29
EP2422325B1 true EP2422325B1 (de) 2016-05-25

Family

ID=42667888

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10717088.8A Active EP2422325B1 (de) 2009-04-22 2010-04-16 Selbstbedienungsterminal mit mindestens einer bilddaten erzeugenden kamera zum erkennen von manipulationsversuchen

Country Status (5)

Country Link
US (1) US9159203B2 (zh)
EP (1) EP2422325B1 (zh)
CN (1) CN102598072B (zh)
DE (1) DE102009018318A1 (zh)
WO (1) WO2010121957A1 (zh)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2974831B1 (en) * 2012-06-12 2021-04-07 Snap-On Incorporated An inventory control system having advanced functionalities
EP2736026B1 (de) * 2012-11-26 2020-03-25 Wincor Nixdorf International GmbH Vorrichtung zum Auslesen einer Magnetstreifen- und/oder Chipkarte mit einer Kamera zur Detektion von eingeschobenen Skimmingmodulen
JP2015012442A (ja) * 2013-06-28 2015-01-19 ソニー株式会社 撮像装置、撮像方法、画像生成装置、画像生成方法、及び、プログラム
US9342717B2 (en) * 2014-02-26 2016-05-17 Ncr Corporation Tamper detection system and method
US10515367B2 (en) * 2014-03-31 2019-12-24 Ncr Corporation Fraud detection in self-service terminal
CN105554391B (zh) * 2015-12-31 2019-05-14 广州广电运通金融电子股份有限公司 摄像头的控制方法及装置、金融设备终端
US11128839B2 (en) * 2016-01-29 2021-09-21 Ncr Corporation Image processing to identify conditions of interest within self-service terminals
US10956544B1 (en) 2016-04-01 2021-03-23 Massachusetts Mutual Life Insurance Company Access control through head imaging and biometric authentication
US10733275B1 (en) 2016-04-01 2020-08-04 Massachusetts Mutual Life Insurance Company Access control through head imaging and biometric authentication
US10346675B1 (en) 2016-04-26 2019-07-09 Massachusetts Mutual Life Insurance Company Access control through multi-factor image authentication
US10354126B1 (en) 2016-04-26 2019-07-16 Massachusetts Mutual Life Insurance Company Access control through multi-factor image authentication
US10643192B2 (en) * 2016-09-06 2020-05-05 Bank Of American Corporation Data transfer between self-service device and server over session or connection in response to capturing sensor data at self-service device
KR102102278B1 (ko) * 2018-10-29 2020-04-21 효성티앤에스 주식회사 금융자동화기기의 입출금장치 및 그 제어 방법
CN112085905B (zh) * 2019-06-14 2022-03-01 中电金融设备系统(深圳)有限公司 磁条卡阅读器、磁条数据处理装置及磁条数据处理方法
US11935055B2 (en) 2021-03-22 2024-03-19 Bank Of America Corporation Wired multi-factor authentication for ATMs using an authentication media
US11676460B1 (en) * 2022-02-04 2023-06-13 Ncr Corporation Currency trapping detection

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2351585B (en) * 1999-06-29 2003-09-03 Ncr Int Inc Self service terminal
DE20102477U1 (de) 2000-02-22 2001-05-03 Wincor Nixdorf Gmbh & Co Kg Einrichtung zum Schutz von SB-Automaten gegen Manipulationen
DE20318489U1 (de) * 2003-11-26 2004-02-19 Conect Kommunikations Systeme Gmbh Überwachungseinrichtung für Geldautomaten sowie Geldautomat
US20090201372A1 (en) * 2006-02-13 2009-08-13 Fraudhalt, Ltd. Method and apparatus for integrated atm surveillance
US20070200928A1 (en) 2006-02-13 2007-08-30 O'doherty Phelim A Method and apparatus for automated video surveillance
US7881497B2 (en) * 2007-03-08 2011-02-01 Honeywell International Inc. Vision based navigation and guidance system
JP4341691B2 (ja) * 2007-04-24 2009-10-07 ソニー株式会社 撮像装置、撮像方法、露光制御方法、プログラム
US8248481B2 (en) * 2009-04-08 2012-08-21 Aptina Imaging Corporation Method and apparatus for motion artifact removal in multiple-exposure high-dynamic range imaging

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
CN102598072A (zh) 2012-07-18
US9159203B2 (en) 2015-10-13
WO2010121957A1 (de) 2010-10-28
US20120038775A1 (en) 2012-02-16
DE102009018318A1 (de) 2010-10-28
CN102598072B (zh) 2015-11-25
EP2422325A1 (de) 2012-02-29

Similar Documents

Publication Publication Date Title
EP2422325B1 (de) Selbstbedienungsterminal mit mindestens einer bilddaten erzeugenden kamera zum erkennen von manipulationsversuchen
EP2422328B1 (de) Selbstbedienungsterminal mit mindestens einer kamera zum erkennen von manipulationsversuchen
EP2422327A1 (de) Selbstbedienungsterminal mit kamera zum erkennen von manipulationsversuchen
WO2010121959A1 (de) Verfahren zum erkennen von manipulationsversuchen an einem selbstbedienungsterminal und datenverarbeitungseinheit dafür
DE69934068T2 (de) Bestimmung der position von augen durch blitzreflexerkennung und korrektur von defekten in einem aufgenommenen bild
EP2897112B1 (de) Verfahren und Vorrichtung zur Vermeidung von Fehlalarmen bei Überwachungssystemen
EP1395945B1 (de) Verfahren zur faelschungserkennung bei der fingerabdruckerkennung unter verwendung einer texturklassifikation von grauwertdifferenzbildern
DE602004005358T2 (de) Objektdetektion in bildern
DE102008039130A1 (de) Durch ein neurales Netzwerk gesteuertes automatisches Verfolgungs- und Erkennungssystem und Verfahren
EP2577620B1 (de) Vorrichtung zur echtheitsprüfung von wertdokumenten
DE102010011225B3 (de) Personendurchgangskontrolle mit Kamerasystem
EP2603905B1 (de) Verfahren und vorrichtung zum erkennen und verifizieren von manipulationsversuchen an einem selbstbedienungsterminal
DE112004000873T5 (de) Automatisches Unterscheidungssystem
EP2422324B1 (de) Selbstbedienungsterminal mit kamera-anordnung zum erkennen von manipulationsversuchen
CN109951637A (zh) 基于大数据的安防监控探头分析处理方法
DE20318489U1 (de) Überwachungseinrichtung für Geldautomaten sowie Geldautomat
EP1329856B1 (de) Verfahren zur Erkennung eines Prägebildes einer Münze in einem Münzautomaten
EP3782136B1 (de) Verfahren zur verifikation eines leuchtstoffbasierten sicherheitsmerkmals
AT503008B1 (de) Interaktives optisches system und verfahren zum extrahieren und verwerten von interaktionen in einem optischen system
EP1197926A2 (de) Verfahren zur Erkennung eines Prägebilds einer Münze in einem Münzautomaten
DE102008055884A1 (de) Verfahren und Vorrichtung zur Detektion einer zweidimensionalen Darstellung des Gesichtes einer Person

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20111103

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

RIN1 Information on inventor provided before grant (corrected)

Inventor name: PRIESTERJAHN, STEFFEN

Inventor name: DRICHEL, ALEXANDER

Inventor name: NOLTE, MICHAEL

Inventor name: LE, DINH-KHOI

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20140626

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20160225

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

Ref country code: AT

Ref legal event code: REF

Ref document number: 802879

Country of ref document: AT

Kind code of ref document: T

Effective date: 20160615

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502010011737

Country of ref document: DE

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20160525

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160825

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160926

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160826

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502010011737

Country of ref document: DE

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 8

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20170228

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170430

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170416

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170430

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20170430

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 9

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170416

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170430

REG Reference to a national code

Ref country code: AT

Ref legal event code: MM01

Ref document number: 802879

Country of ref document: AT

Kind code of ref document: T

Effective date: 20170416

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170416

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20100416

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20160525

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160525

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160925

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 502010011737

Country of ref document: DE

Representative=s name: KILBURN & STRODE LLP, NL

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 14

REG Reference to a national code

Ref country code: GB

Ref legal event code: 732E

Free format text: REGISTERED BETWEEN 20230323 AND 20230329

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20230321

Year of fee payment: 14

REG Reference to a national code

Ref country code: GB

Ref legal event code: 732E

Free format text: REGISTERED BETWEEN 20230525 AND 20230601

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20230321

Year of fee payment: 14

REG Reference to a national code

Ref country code: DE

Ref legal event code: R081

Ref document number: 502010011737

Country of ref document: DE

Owner name: DIEBOLD NIXDORF SYSTEMS GMBH, DE

Free format text: FORMER OWNER: WINCOR NIXDORF INTERNATIONAL GMBH, 33106 PADERBORN, DE

Ref country code: DE

Ref legal event code: R082

Ref document number: 502010011737

Country of ref document: DE

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20240320

Year of fee payment: 15