EP2280382A1 - Systems and methods for video- and location-based identification - Google Patents


Info

Publication number
EP2280382A1
Authority
EP
European Patent Office
Prior art keywords
individual
wireless communication
identification information
location
terminals
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP10170564A
Other languages
English (en)
French (fr)
Other versions
EP2280382B1 (de)
Inventor
Gideon Hazzani
Arie Briness
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Verint Systems Ltd
Original Assignee
Verint Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Verint Systems Ltd filed Critical Verint Systems Ltd
Publication of EP2280382A1
Application granted
Publication of EP2280382B1
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00 Burglar, theft or intruder alarms
    • G08B 13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B 13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B 13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B 13/19639 Details of the system layout
    • G08B 13/19645 Multiple cameras, each having view on one of a plurality of scenes, e.g. multiple cameras for multi-room surveillance or for tracking an object by view hand-over
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00 Burglar, theft or intruder alarms
    • G08B 13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B 13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B 13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B 13/19697 Arrangements wherein non-video detectors generate an alarm themselves
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00 Network architectures or network communication protocols for network security
    • H04L 63/30 Network architectures or network communication protocols for network security for supporting lawful interception, monitoring or retaining of communications or communication related information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/023 Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/029 Location-based management or tracking services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 64/00 Locating users or terminals or network equipment for network management purposes, e.g. mobility management

Definitions

  • the present disclosure relates generally to surveillance systems, and particularly to methods and systems for combining video surveillance and location tracking information.
  • Video surveillance systems are deployed and operated in various applications, such as airport security, crime prevention and access control.
  • multiple video cameras acquire video footage, which is viewed and/or recorded at a monitoring center.
  • Mobile communication networks deploy various techniques for measuring the geographical locations of wireless communication terminals. Such techniques are used, for example, for providing Location Based Services (LBS) and emergency services in cellular networks. Some location tracking techniques are based on passive probing of network events generated by the wireless terminals. Other techniques are active, i.e., proactively request the network or the terminal to provide location information.
  • An embodiment that is described herein provides a system, including:
  • the notification indicates a position within the images in which the individual is observed
  • the processor is configured to translate the position into an estimated geographical location of the individual and to identify the wireless communication terminals in the vicinity of the individual based on the estimated geographical location.
  • the processor is configured to extract identity attributes of one or more of the identified wireless communication terminals from the location information, and to retrieve the identification information responsively to the extracted identity attributes.
  • the identity attributes may include an International Mobile Subscriber Identity (IMSI) and/or a Temporary Mobile Subscriber Identity (TMSI).
  • the processor is configured to retrieve the identification information by querying a remote database with the identity attributes.
  • the identification information of a given wireless communication terminal includes personal information related to a subscriber of the given wireless communication terminal.
  • the processor is configured to present at least some of the identification information to an operator.
  • a system including:
  • the system further includes an interface, which is operative to receive the location information from at least one communication network with which the wireless communication terminals communicate.
  • the system includes one or more interrogation devices, which are detached from any communication network with which the wireless terminals communicate, and which are configured to establish communication with the wireless communication terminals in the vicinity of the individual so as to produce the location information, and to provide the location information to the processor.
  • the interrogation devices are configured to extract from the communication identity attributes of the wireless communication terminals in the vicinity of the individual and to provide the identity attributes to the processor, and the processor is configured to retrieve the identification information responsively to the extracted identity attributes.
  • the identity attributes may include an International Mobile Subscriber Identity (IMSI) and/or a Temporary Mobile Subscriber Identity (TMSI).
  • each interrogation device is associated with a respective camera, and has a coverage area that matches a field-of-view of the camera.
  • when the individual appears in the images produced by multiple cameras, the processor is configured to present the identification information of a given wireless communication terminal only if the given wireless communication terminal established communication with at least two of the interrogation devices associated with the cameras in which the individual appears. In an embodiment, the processor is configured to present at least some of the identification information to the operator using the operator terminal.
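The multi-camera cross-confirmation described above amounts to a simple counting rule: a terminal is retained only if it was detected by the interrogators of at least two cameras in which the individual appears. The sketch below is illustrative only; the function name and data shapes are assumptions, not taken from the patent.

```python
# Illustrative sketch of multi-camera cross-confirmation.
# `detections` maps a camera/interrogator id to the set of terminal
# identities (e.g., IMSIs) that communicated with its interrogator.

def confirmed_terminals(cameras_with_individual, detections, min_cameras=2):
    counts = {}
    for cam in cameras_with_individual:
        for terminal in detections.get(cam, set()):
            counts[terminal] = counts.get(terminal, 0) + 1
    # keep only terminals seen by at least `min_cameras` interrogators
    return {t for t, n in counts.items() if n >= min_cameras}

detections = {
    "cam_A": {"IMSI_1", "IMSI_2"},
    "cam_B": {"IMSI_2", "IMSI_3"},
    "cam_C": {"IMSI_4"},
}
# Only IMSI_2 was detected by both cameras in which the individual appears.
result = confirmed_terminals(["cam_A", "cam_B"], detections)
```

Requiring detections from two or more interrogators reduces the likelihood that a bystander's phone, present in a single coverage area, is falsely associated with the observed individual.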
  • a method including:
  • Video surveillance systems typically collect video images from multiple video cameras, and present the images to a control center operator.
  • the operator may observe events of interest in the images and take appropriate actions.
  • Events that may trigger responsive action may comprise, for example, suspected criminal or terrorist activities.
  • events of interest involve individuals that are observed in the images.
  • the operator usually has no information as to the identity of the observed individuals. Such identity information is sometimes crucial for handling the event, e.g., for deciding on the appropriate responsive action.
  • Embodiments that are described herein provide improved methods and systems for video surveillance, which give the operator identification information regarding individuals that are observed in video images.
  • the identification information is produced by detecting one or more wireless communication terminals (e.g., cellular phones) that are located in the vicinity of the observed individual and are therefore likely to belong to the individual. Identification information associated with these terminals or their subscribers is obtained and presented to the operator.
  • a surveillance system comprises a video surveillance subsystem that presents video images, a location tracking subsystem that measures locations of wireless terminals, and a correlation processor that correlates the information provided by the two subsystems.
  • the operator points to an individual of interest observed in the images, such as by clicking a mouse over the individual in the displayed video images.
  • the correlation processor determines the geographical location of the individual, and identifies one or more wireless terminals located in the vicinity of this geographical location based on location information provided by the location tracking subsystem.
  • the correlation processor obtains identification information related to those terminals, and presents the information to the operator.
  • the correlation processor may focus on wireless terminals that are identified in the fields-of-view of two or more cameras, in order to reduce the likelihood of false identification.
  • the methods and systems described herein provide a powerful enhancement to video surveillance systems.
  • the operator is provided with identity information (e.g., name and phone number) of the observed individual in real time.
  • responsive actions can be more effective, since they can be adapted to the specific identity of the individual.
  • the location tracking subsystem comprises a set of wireless interrogation devices.
  • Each interrogation device is able to query identity attributes (e.g., International Mobile Subscriber Identity - IMSI) of wireless terminals in its coverage area.
  • the coverage areas of the interrogation devices typically correspond to the fields-of-view of the cameras, so that detected wireless terminals are likely to correspond to individuals observed in the images.
  • the location tracking subsystem may obtain the identity attributes from the wireless network with which the terminals communicate.
  • the identity attributes may be obtained using passive probing techniques or using active location techniques.
  • Fig. 1 is a pictorial, schematic illustration of a surveillance system 20, in accordance with an embodiment of the present disclosure.
  • System 20 monitors individuals 24, some of which may be operating wireless communication terminals 28, in a certain area of interest 32.
  • Systems of this sort may be operated, for example, by law enforcement agencies, for applications such as anti-terrorism and crime prevention.
  • System 20 may be deployed in any suitable area of interest, such as, for example, a neighborhood or an airport.
  • System 20 comprises a video surveillance subsystem, which comprises video cameras 36 and a networked video server 40. Each camera has a certain field-of-view, which covers a particular sector in area 32. The cameras capture video images of their respective sectors and send the images to video server 40. Server 40 sends the images to a control center, in which the images are presented to an operator 42 on a display 43.
  • system 20 comprises four cameras denoted 36A...36D. Four image displays 44A...44D are presented simultaneously to the operator, showing the images captured by cameras 36A...36D, respectively. Alternatively, any other suitable number of cameras and image displays, as well as any other suitable way of displaying the captured images to the operator, can also be used.
  • System 20 further includes a location tracking subsystem, which measures the geographical locations of wireless communication terminals 28 in area 32.
  • Terminals 28 may comprise, for example, cellular phones, wireless-enabled computers or Personal Digital Assistants (PDAs), or any other suitable communication or computing device having wireless communication capabilities.
  • Each terminal 28 communicates with a certain communication network (not shown in the figure).
  • the terminals tracked by the location tracking subsystem may belong to a single communication network or to multiple networks.
  • the networks and terminals may conform to any suitable communication standard or protocol, such as Long Term Evolution (LTE), Universal Mobile Telecommunication System (UMTS), CDMA2000 or other third generation (3G) cellular standard, Global System for Mobile communication (GSM) or Integrated Digital Enhanced Network (IDEN).
  • the networks and terminals may conform to the IEEE 802 family of wireless standards.
  • the location tracking subsystem comprises one or more interrogation devices 52 (referred to as interrogators for brevity), which are connected to a location processor 48.
  • a given interrogator 52 establishes communication with wireless terminals 28 in a given coverage area, in order to extract identity attributes of the terminals.
  • Each interrogator typically comprises a directional antenna, whose beam pattern (combined with the interrogator's transmission power) determines the coverage area.
  • a typical interrogator imitates the operation of a base station, and solicits a wireless terminal to start communicating with the interrogator.
  • the interrogator typically communicates with the terminal for a short period of time, during which it extracts the identity attributes of the terminal.
  • Each interrogator 52 sends the extracted identity attributes to location processor 48. For example, a given interrogator may force any terminal that enters its coverage area to perform a LOCATION UPDATE process, which reveals its identity.
  • Interrogation devices 52 may extract various identity attributes of the terminal, such as, for example, the terminal's International Mobile Subscriber Identity (IMSI), Temporary Mobile Subscriber Identity (TMSI), or any other suitable attribute indicating the identity of the terminal.
  • the above-described attribute extraction functions can be carried out using known interrogation devices, which are sometimes referred to as "IMSI catchers" or "TMSI catchers." Examples of IMSI/TMSI catching techniques are described, for example, by Strobel in "IMSI Catcher," July 13, 2007, which is incorporated herein by reference, by Asokan et al. in "Man-in-the-Middle Attacks in Tunneled Authentication Protocols," the 2003 Security Protocols Workshop, Cambridge, UK, April 2-4, 2003, which is incorporated herein by reference, and by Meyer and Wetzel in "On the Impact of GSM Encryption and Man-in-the-Middle Attacks on the Security of Interoperating GSM/UMTS Networks," which is incorporated herein by reference.
  • the TMSI catcher should be operated in combination with another system element (e.g., a passive probe), which translates TMSI values to IMSI values.
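One way such a translation element might work, assuming a passive probe can observe the signalling in which the network reassigns temporary identities, is to maintain a running TMSI-to-IMSI table. The class and method names below are hypothetical:

```python
# Hypothetical sketch of a TMSI-to-IMSI translation element fed by a
# passive probe. Not a real product interface.

class TmsiTranslator:
    def __init__(self):
        self._tmsi_to_imsi = {}

    def observe_reallocation(self, tmsi, imsi):
        # Called by the passive probe when it observes the network
        # assigning a (new) TMSI to a terminal whose IMSI is known.
        self._tmsi_to_imsi[tmsi] = imsi

    def translate(self, tmsi):
        # Returns the IMSI for a captured TMSI, or None if unknown.
        return self._tmsi_to_imsi.get(tmsi)

translator = TmsiTranslator()
translator.observe_reallocation("0xA1B2", "IMSI_1")
```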
  • System 20 comprises a correlation processor 56, which correlates the video images and the location tracking estimation, so as to provide operator 42 with identification information regarding individuals that are observed in the video images.
  • processor 56 accepts from operator 42 a notification, which indicates an individual of interest observed by the operator in the displayed video.
  • the notification indicates a position of the observed individual within the images.
  • This position (also referred to as an image position) may be expressed, for example, as a two-dimensional coordinate on display 43, as a pixel index in the image or in any other suitable way.
  • the operator indicates the image position using an input device 41, such as a mouse, a trackball or a keyboard. For example, the operator may move a cursor to the image position in which the individual is observed, and then press the mouse button.
  • operator 42 may indicate the observed individual's position in the displayed images to processor 56 using any other suitable means.
  • Processor 56 translates the image position indicated by the operator into an estimated geographical location (e.g., a geographical coordinate) of the individual.
  • processor 56 uses a predefined mapping of image coordinates (or pixel indices) into geographical coordinates. This sort of mapping translates each image position (e.g., X/Y screen coordinate or pixel index) in the field-of-view of a given camera into a respective geographical coordinate in area 32.
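As a minimal illustration of such a predefined mapping, the sketch below applies a per-camera affine transform from pixel coordinates to latitude/longitude; a real deployment would calibrate each camera against surveyed reference points (e.g., with a homography). All coefficients and identifiers here are made-up assumptions.

```python
# Illustrative per-camera mapping of image coordinates to geographical
# coordinates. Coefficients are placeholders, not calibration data.

CAMERA_MAPPINGS = {
    # camera id -> (a, b, c, d, e, f) such that:
    #   lat = a*x + b*y + c
    #   lon = d*x + e*y + f
    "cam_A": (0.0, -1e-5, 32.1000, 1e-5, 0.0, 34.8000),
}

def image_to_geo(camera_id, x, y):
    a, b, c, d, e, f = CAMERA_MAPPINGS[camera_id]
    return (a * x + b * y + c, d * x + e * y + f)

lat, lon = image_to_geo("cam_A", 200, 100)
```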
  • Processor 56 now interacts with the location tracking subsystem, in order to identify wireless communication terminals 28 that may be located in the vicinity of the observed individual's geographical location.
  • the assumption is that a terminal located near this location may belong to the individual of interest.
  • processor 56 indicates the individual's geographical location (as estimated based on the operator's notification) to location processor 48, and requests processor 48 to identify wireless terminals that are located in the vicinity of this geographical location.
  • processor 48 obtains location information from one or more interrogators 52 in order to identify the nearby terminals.
  • the estimated geographical location of the individual may be indicated to processor 48 in various ways, and not necessarily by reporting the screen coordinate indicated by the operator.
  • processor 56 may indicate to processor 48 the identity of the camera in which the individual of interest was observed, and processor 48 may identify the wireless terminals that established communication with the associated interrogator.
  • an indication that a certain terminal communicated with a certain interrogator is regarded as a sort of location estimation, since this indication implies that the terminal in question was located within the known and confined coverage area of the interrogator.
  • processor 56 provides the estimated geographical coordinate of the individual to processor 48.
  • Processor 48 selects one or more interrogators whose coverage areas contain this geographical coordinate, and identifies the terminals that established communication with these interrogators.
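This selection step can be sketched by approximating each interrogator's coverage area as a circle around its antenna and testing whether the estimated coordinate falls inside it. The identifiers, coordinates and radii below are illustrative assumptions.

```python
import math

# Illustrative selection of interrogators whose (circular) coverage
# area contains the individual's estimated geographical coordinate.

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two lat/lon points.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def covering_interrogators(lat, lon, interrogators):
    """interrogators: iterable of (id, lat, lon, coverage_radius_m)."""
    return [iid for iid, ilat, ilon, radius in interrogators
            if haversine_m(lat, lon, ilat, ilon) <= radius]

interrogators = [
    ("int_A", 32.1000, 34.8000, 200.0),
    ("int_B", 32.2000, 34.8000, 200.0),
]
covering = covering_interrogators(32.1001, 34.8000, interrogators)
```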
  • interrogators 52 extract identity attributes (e.g., IMSI) from the wireless terminals they communicate with.
  • the interrogators report the extracted identity attributes to processor 48, which in turn may report them to processor 56 in response to queries.
  • processor 56 requests the location tracking subsystem to identify wireless terminals located in the vicinity of the determined geographical location of the observed individual. In response to the request, the location tracking subsystem identifies such nearby terminals and reports their identity attributes (e.g., IMSI) to processor 56.
  • Processor 56 obtains identification information of the nearby terminals using the identity attributes reported by the location tracking subsystem. For example, processor 56 may query various databases using the reported IMSI, and retrieve various kinds of information regarding the wireless terminal and/or its subscriber. Identification information may comprise, for example, the telephone number assigned to the terminal, subscriber details such as name, address, credit card details, driving license details, nationality, passport number, and/or any other kind of identification information related to the terminal or its subscriber. In some embodiments, processor 56 may communicate with remote databases, external to system 20, for this purpose.
  • processor 56 evaluates the retrieved identification information, and may determine that the terminal in question is suspicious. The processor may report this suspicion to the operator. Processor 56 may apply any suitable criteria for determining whether a certain terminal is suspicious. For example, processor 56 may regard as suspicious terminals subscribed in hostile countries. Country information can be extracted, for example, from a country code that is part of the terminal's IMSI.
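The country-code criterion can be illustrated as follows: the first three digits of an IMSI are the Mobile Country Code (MCC), which can be compared against a watchlist. The watchlist entry below uses 999, a value reserved for internal/private use, purely as a placeholder.

```python
# Illustrative suspicion check based on the IMSI country code.
# The watchlist is a placeholder, not a statement about any country.

SUSPICIOUS_MCCS = {"999"}

def is_suspicious(imsi):
    # The first three digits of an IMSI are the Mobile Country Code (MCC).
    mcc = imsi[:3]
    return mcc in SUSPICIOUS_MCCS
```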
  • Correlation processor 56 presents the retrieved identification information to operator 42 using display 43.
  • the identification information may be projected onto the displayed image of the individual, or displayed in any other suitable way.
  • the location tracking subsystem identifies and reports multiple terminals 28 as possible candidates that may belong to the individual of interest.
  • processor 56 may present the identification information of any desired subset of these candidate terminals, or even of all candidate terminals.
  • Interrogators 52 may be operated in a free-running manner or on an on-demand basis. In free-running operation, each interrogator attempts to establish communication with any terminal in its coverage area, and capture the terminal's identity attributes, regardless of any request or trigger. The interrogators (or processor 48) record the extracted identity attributes and report them as needed. In on-demand operation, processor 48 triggers a given interrogator (or multiple interrogators) in response to a specific request from processor 56.
  • Interrogators 52 may be active or passive. In active operation, which is typical of interrogators that capture the terminal's IMSI, the interrogator proactively initiates a communication session with the terminals in its coverage area. In passive operation, which is common for interrogators that capture the terminal's TMSI, the interrogator passively monitors the terminal's operation without actively communicating with it.
  • the techniques described herein can be implemented using active and/or passive interrogators.
  • Fig. 2 is a pictorial, schematic illustration of a surveillance system 60, in accordance with an alternative embodiment of the present disclosure.
  • system 60 uses location measurements acquired by one or more location tracking subsystems, which are part of the communication network with which terminals 28 communicate.
  • the location tracking subsystem may apply any suitable location tracking technique available in the network, or a combination of such techniques, in order to measure terminal locations.
  • Some location tracking techniques, referred to as network-based techniques, are carried out by base stations 68 and other network-side components of the network, without necessarily using special hardware or software in terminals 28.
  • Other location tracking techniques are terminal-based, i.e., use special hardware or software in wireless terminals 28.
  • Some examples of location tracking techniques that can be used for this purpose are described in U.S. Patent Application Serial Number 12/497,799, filed July 6, 2009 , whose disclosure is incorporated herein by reference.
  • the location tracking techniques may be passive or active. Passive techniques perform unobtrusive probing of the signaling information transmitted in the network, and extract location information from the monitored signaling. Active techniques, on the other hand, proactively request the network or the terminal to provide location information. Typically although not necessarily, passive techniques are deployed in a massive, non-selective manner and produce large numbers of location records. Active techniques are typically deployed in a selective, on-demand manner. Active techniques are sometimes carried out by a Gateway Mobile Location Center (GMLC) of the communication network. In some embodiments, an active technique can be triggered to locate a terminal that has been identified in the area of interest by a passive technique. This feature often involves defining the area of interest as a separate "location area" in the location tracking subsystem, so as to cause the terminal to initiate a LOCATION UPDATE.
  • system 60 comprises passive probes 64, which intercept communication events occurring in the network and extract location information from these events.
  • system 60 can be implemented using any other location tracking technique, such as the techniques described above.
  • the location information provided by the location tracking subsystem comprises identity attributes of the terminals, such as IMSI or TMSI.
  • correlation processor 56 accepts a notification from the operator of a control center 72, indicating an individual of interest observed in the video images produced by the video surveillance subsystem.
  • the notification indicates the image position of the individual, as described in Fig. 1 above.
  • Correlation processor 56 estimates the geographical location of the individual, and identifies one or more wireless terminals located in the vicinity of this geographical location.
  • Processor 56 identifies the terminals using the location information provided by probes 64.
  • processor 56 may request the location tracking subsystem to identify and report terminals that are located in the vicinity of the observed individual.
  • processor 56 may receive non-filtered location information from the location tracking subsystem, and identify the wireless terminals that are nearby the observed individual.
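Filtering non-filtered location records by proximity might look like the sketch below, which uses a flat-earth approximation (adequate over distances of tens of meters) to compare each record's measured position against the individual's estimated location. The distance threshold and record shape are assumptions.

```python
import math

# Illustrative proximity filter over raw location records.
# `records` is an iterable of (imsi, lat, lon) tuples.

def nearby_terminals(ind_lat, ind_lon, records, max_m=50.0):
    out = []
    for imsi, lat, lon in records:
        # Flat-earth approximation: metres per degree of lon shrink
        # with the cosine of the latitude; lat degrees are ~constant.
        dx = (lon - ind_lon) * 111320.0 * math.cos(math.radians(ind_lat))
        dy = (lat - ind_lat) * 110540.0
        if math.hypot(dx, dy) <= max_m:
            out.append(imsi)
    return out

records = [("IMSI_1", 32.10001, 34.80001), ("IMSI_2", 32.20000, 34.80000)]
near = nearby_terminals(32.1, 34.8, records)
```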
  • Having identified one or more terminals 28 whose location is adjacent to the individual observed in the video images, processor 56 extracts the identity attributes (e.g., IMSI) of these terminals. Processor 56 obtains identification information of the terminals and/or their subscribers based on the extracted identity attributes. The identification information is presented to the operator, as described above.
  • configurations based on interrogators are particularly suitable for relatively small and confined areas of interest, such as airports.
  • Interrogator-based configurations are often capable of providing cell-ID location estimation (relative to their own antennas), and are independent of external elements such as cellular network providers.
  • Configurations based on network resources are generally more suitable for larger areas, such as in a crime prevention application that covers an entire city.
  • Figs. 1 and 2 are example configurations, which were selected purely for the sake of conceptual clarity. In alternative embodiments, any other suitable system configuration can also be used.
  • a surveillance system may use both dedicated interrogators and location tracking resources of the communication network.
  • Fig. 3 is a block diagram that schematically illustrates a surveillance system 80, in accordance with an embodiment of the present disclosure.
  • System 80 comprises a video surveillance subsystem 82, which comprises multiple cameras 36 connected to networked video server 40.
  • Subsystem 82 comprises a video records database 84, which stores captured video footage for off-line viewing and analysis.
  • Subsystem 82 also comprises an image-to-location mapping database 88.
  • Database 88 stores a predefined mapping of image coordinates to geographical coordinates for each camera 36.
  • Server 40 (or processor 56) queries this database in order to translate an image position of an individual observed in the field-of-view of a given camera into a geographical location.
  • System 80 comprises a location tracking subsystem 92, which comprises location processor 48, one or more interrogators 52 and/or one or more probes 64.
  • a location records database 96 stores location records of wireless terminals, whose locations were measured by interrogators 52 and/or probes 64. The location records comprise, in addition to the measured locations, identity attributes of the terminals.
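A record of the kind stored in location records database 96 might be modeled as follows; the field names are assumptions that merely reflect the description above (a measured location plus identity attributes and the measurement source):

```python
# Hypothetical shape of a record in location records database 96.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LocationRecord:
    imsi: str              # identity attribute extracted from the terminal
    tmsi: Optional[str]    # temporary identity, if only a TMSI was captured
    lat: float             # measured latitude
    lon: float             # measured longitude
    timestamp: float       # measurement time (epoch seconds)
    source: str            # "interrogator" or "probe"

rec = LocationRecord("IMSI_1", None, 32.1000, 34.8000, 0.0, source="probe")
```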
  • Correlation processor 56 comprises an image interface 100 for interacting with video surveillance subsystem 82, e.g., for receiving video images for display.
  • Processor 56 also comprises a location interface 104 for interacting with location tracking subsystem 92, e.g., for receiving location measurements and identity attributes of wireless terminals.
  • Processor 56 stores correlation results between video images, geographical locations, terminal identity attributes and identification information in a correlation result database 108.
  • Processor 56 obtains identification information (e.g., name, address or any other relevant information) from a personal information database 112, which can be queried using identity attributes such as IMSI.
  • Control center 72 comprises a map database 116 and a Geographic Information System (GIS) 120, for presenting maps and other geographical layers to the operator.
  • A Location-Based Application (LBA) server 124 combines the different types of information for display, e.g., map information from GIS 120, video images from subsystem 82 and identification information from correlation processor 56. The information is presented to the operator using an operator terminal 128. LBA server 124 also transfers operator notifications regarding observed individuals to correlation processor 56. The interface between processor 56 and server 124 is thus bidirectional: server 124 sends operator notifications to processor 56, and receives from processor 56 video images for display and the correlated identification information.
  • Processors 48 and 56 and servers 40 and 124 comprise general-purpose computers, which are programmed in software to carry out the functions described herein.
  • The software may be downloaded to the computers in electronic form, over a network, for example, or it may, alternatively or additionally, be provided and/or stored on tangible media, such as magnetic, optical or electronic memory.
  • Functions of processors 48 and 56 and servers 40 and 124 can be integrated and/or partitioned among any desired number of computing platforms.
  • Some or all of the functions of correlation processor 56 can be integrated into LBA server 124.
  • Some of the functions of video server 40 and/or of location processor 48 can be carried out by correlation processor 56, and vice versa.
  • Multiple wireless terminals may be active in the vicinity of the individual of interest.
  • Multiple nearby terminals may be detected, for example, when the mapping of image positions to geographical locations has a relatively coarse accuracy or resolution, when the location tracking subsystem measures terminal locations at a relatively coarse accuracy (e.g., cell ID), and/or when additional individuals are located near the individual of interest.
  • This situation increases the likelihood of false identification, i.e., the likelihood of presenting identification information that is unrelated to the individual of interest.
  • The likelihood of false identification can be reduced, for example, by narrowing the fields-of-view of cameras 36 and the coverage areas of the associated interrogators 52. A narrow coverage area will typically reduce the number of terminals identified by a given interrogator/camera pair.
  • Processor 56 may reduce the likelihood of false identification by considering inputs from multiple cameras and interrogators.
  • Processor 56 may select only terminals that are identified by two or more of the interrogators associated with the cameras in which the individual was observed. Typically, only the identification information of these selected terminals is presented to the operator.
  • These techniques are also effective in tracking moving individuals.
  • Individual 24 is shown in motion and is tracked at different locations at different points in time.
  • Fig. 4 is a flow chart that schematically illustrates a surveillance method, in accordance with an embodiment of the present disclosure.
  • The method begins with correlation processor 56 receiving a notification from operator 42, at a notification step 130.
  • The notification indicates an image position (e.g., a screen coordinate) of an individual of interest whom the operator observed in the displayed video.
  • Processor 56 translates the image position into a geographical location, at a translation step 134.
  • Processor 56 may query a predefined mapping of image positions to geographical locations, such as database 88 of Fig. 3 above.
  • Processor 56 uses the location tracking subsystem to identify one or more wireless terminals that are located in the vicinity of the geographical location of the individual in question, at a terminal selection step 138.
  • Any suitable location tracking technique can be used for this purpose, whether based on dedicated interrogators and/or on resources of the communication network to which the terminals belong.
  • Processor 56 may apply various proximity criteria to determine which terminals are to be regarded as being near the individual. For example, a proximity criterion may take into account the measurement accuracy of the location tracking technique being used.
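An accuracy-aware proximity criterion can be sketched as follows. The base radius, the record layout and the function names are assumptions for illustration; the patent does not specify concrete thresholds.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_terminals(ind_lat, ind_lon, records, base_radius_m=50.0):
    """Proximity criterion sketch: a terminal is regarded as nearby if it lies
    within a radius that widens with the measurement accuracy of its record.
    The 50 m base radius is an illustrative assumption, not a patent value."""
    selected = []
    for rec in records:
        dist = haversine_m(ind_lat, ind_lon, rec["lat"], rec["lon"])
        if dist <= base_radius_m + rec["accuracy_m"]:
            selected.append(rec["imsi"])
    return selected
```

Widening the radius per record means that a coarse cell-ID measurement is still considered, while a precise measurement far from the individual is rejected.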
  • Processor 56 extracts the IMSI (or other identity attributes) of the selected terminals from the location information provided by the location tracking subsystem, at an IMSI extraction step 142. Using the extracted IMSI of a given terminal, processor 56 obtains identification information regarding the terminal and/or its subscriber, at an information retrieval step 146. This information can be retrieved, for example, from database 112 of Fig. 3 above. Processor 56 presents the retrieved identification information to the operator, at a presentation step 150. In some embodiments, LBA server 124 presents the operator with a map display, over which the identified terminal locations and associated identification information are projected. The operator may take any appropriate action with respect to the observed individual in response to the displayed identification information.
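Steps 130 through 150 can be sketched as one orchestration. The helper parameters below are assumptions standing in for the subsystem interfaces described above (the translation of step 134, the terminal selection of step 138, and the database-112 lookup of step 146).

```python
def handle_operator_notification(camera_id, x_px, y_px,
                                 translate,       # image position -> (lat, lon), step 134
                                 find_terminals,  # (lat, lon) -> location records, step 138
                                 lookup_person):  # IMSI -> identification info, step 146
    """Illustrative end-to-end flow from operator notification (step 130)
    to the identification information presented at step 150."""
    lat, lon = translate(camera_id, x_px, y_px)
    results = []
    for rec in find_terminals(lat, lon):
        imsi = rec["imsi"]  # step 142: extract the identity attribute
        results.append({"imsi": imsi, "info": lookup_person(imsi)})
    return results          # step 150: presented to the operator, e.g. on a map
```

Passing the three stages in as callables keeps the sketch independent of whether the location fixes come from dedicated interrogators or from network probes.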
  • When an individual of interest is observed, the system may determine a set of cameras in which this individual appears. A corresponding set of interrogators can be triggered, such that each interrogator produces a list of IMSIs identified in its coverage area. For each camera in the set, the system selects one or more IMSIs whose estimated location is in the vicinity (e.g., within a certain radius) of the geographical location of the individual. The system then determines the intersection of the IMSI lists, i.e., the one or more IMSIs that appear in all the lists. The intersection of the IMSI lists forms the result set, whose identification information is subsequently presented to the operator. In some embodiments, IMSIs that appear in a certain number of the lists, but not necessarily in all of them, can be regarded as belonging to the result set.
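The list-intersection step, including the relaxed "appears in at least k lists" variant, can be sketched as:

```python
from collections import Counter

def result_set(imsi_lists, min_appearances=None):
    """Return the IMSIs that appear in at least `min_appearances` of the
    per-camera lists. The default is the strict intersection, i.e., IMSIs
    appearing in all lists; a smaller threshold gives the relaxed variant."""
    if min_appearances is None:
        min_appearances = len(imsi_lists)
    counts = Counter()
    for imsis in imsi_lists:
        counts.update(set(imsis))  # count each list at most once per IMSI
    return {imsi for imsi, n in counts.items() if n >= min_appearances}
```

Deduplicating each list before counting ensures that an interrogator reporting the same terminal twice does not inflate its appearance count.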
  • The embodiments described herein refer mainly to real-time operation, in which the displayed video reflects live events as they occur, and in which the operator is provided with identification information in real time.
  • The methods and systems described herein can also be applied in off-line applications.
  • Video footage that is stored in database 84 can be analyzed off-line using the disclosed techniques.
  • The embodiments described herein refer mainly to stationary video cameras, i.e., cameras whose field-of-view is fixed. Alternatively, however, the methods and systems described herein can also be used with non-stationary cameras, such as scanning cameras or cameras having multiple switchable fields-of-view. Some system elements, such as mapping database 88, should be adapted to account for such field-of-view variations. Moreover, the methods and systems described herein are not limited to video cameras, and can be used with cameras that produce still images, as well. Although the embodiments described herein refer to networked video cameras that are controlled by a video server, the disclosed methods and systems can be implemented using autonomous cameras, as well.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Technology Law (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Telephonic Communication Services (AREA)
  • Alarm Systems (AREA)
EP10170564.8A 2009-07-26 2010-07-23 Verfahren zur video- und positionsbasierten Identifizierung Active EP2280382B1 (de)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
IL200065A IL200065A (en) 2009-07-26 2009-07-26 Location and contract based identification systems and methods

Publications (2)

Publication Number Publication Date
EP2280382A1 true EP2280382A1 (de) 2011-02-02
EP2280382B1 EP2280382B1 (de) 2016-08-24

Family

ID=42263815

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10170564.8A Active EP2280382B1 (de) 2009-07-26 2010-07-23 Verfahren zur video- und positionsbasierten Identifizierung

Country Status (3)

Country Link
US (1) US9247216B2 (de)
EP (1) EP2280382B1 (de)
IL (1) IL200065A (de)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140242948A1 (en) * 2013-02-26 2014-08-28 U-TX Ltd Method of linking a specific wireless device to the identity and/or identification measure of the bearer
EA201301010A1 (ru) * 2013-05-23 2014-11-28 Общество с ограниченной ответственностью "Синезис" Способ и система поиска видеоданных по идентификатору персонального мобильного устройства
US9565400B1 (en) * 2013-12-20 2017-02-07 Amazon Technologies, Inc. Automatic imaging device selection for video analytics
JP6573361B2 (ja) * 2015-03-16 2019-09-11 キヤノン株式会社 画像処理装置、画像処理システム、画像処理方法及びコンピュータプログラム
FR3034235A1 (fr) * 2015-03-25 2016-09-30 Sysint Systeme de surveillance a correlation d'images et d'identifications de telephones mobiles
US10789350B2 (en) * 2015-04-17 2020-09-29 Verifone, Inc. Computerized system and method for associating RF signals
IL241387B (en) * 2015-09-09 2020-07-30 Verint Systems Ltd System and method for identifying imaging devices
KR102432806B1 (ko) * 2015-10-26 2022-08-12 한화테크윈 주식회사 감시 시스템 및 그 제어 방법
US10080129B2 (en) * 2016-07-25 2018-09-18 Kiana Analytics Inc. Method and apparatus for integrated tracking of visitors
US11183452B1 (en) 2020-08-12 2021-11-23 Infineon Technologies Austria Ag Transfering informations across a high voltage gap using capacitive coupling with DTI integrated in silicon technology
US20230230379A1 (en) * 2022-01-19 2023-07-20 Target Brands, Inc. Safety compliance system and method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19920222A1 (de) * 1999-05-03 2000-11-09 Rohde & Schwarz Verfahren zum Identifizieren des Benutzers eines Mobiltelefons oder zum Mithören der abgehenden Gespräche
US20030032436A1 (en) * 2001-08-07 2003-02-13 Casio Computer Co., Ltd. Apparatus and method for searching target position and recording medium
US20040169587A1 (en) * 2003-01-02 2004-09-02 Washington Richard G. Systems and methods for location of objects
US20070268392A1 (en) * 2004-12-31 2007-11-22 Joonas Paalasmaa Provision Of Target Specific Information
EP1924117A2 (de) * 2005-08-23 2008-05-21 Thales Defence Deutschland GmbH Verfahren und Vorrichtung zum Identifizieren eines mobilen Endgeräts in einem digitalen zellulären Mobilfunknetz
US20080240616A1 (en) * 2007-04-02 2008-10-02 Objectvideo, Inc. Automatic camera calibration and geo-registration using objects that provide positional information
US20090054077A1 (en) * 2007-08-23 2009-02-26 Telefonaktiebolaget Lm Ericsson (Publ) Method and apparatus for sending data relating to a target to a mobile device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7305467B2 (en) * 2002-01-02 2007-12-04 Borgia/Cummins, Llc Autonomous tracking wireless imaging sensor network including an articulating sensor and automatically organizing network nodes
US7796154B2 (en) * 2005-03-07 2010-09-14 International Business Machines Corporation Automatic multiscale image acquisition from a steerable camera
US8385883B2 (en) * 2007-02-06 2013-02-26 Qualcomm Incorporated Apparatus and methods for locating, tracking and/or recovering a wireless communication device
US8116723B2 (en) * 2008-01-17 2012-02-14 Kaltsukis Calvin L Network server emergency information accessing method
IL198100A (en) 2009-04-07 2016-08-31 Intucell Ltd METHOD AND SYSTEM FOR GETTING INFORMATION ON A RADIO ACCESS NETWORK OF A MOBILE COMMUNICATION NETWORK
US8531523B2 (en) * 2009-12-08 2013-09-10 Trueposition, Inc. Multi-sensor location and identification

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ASOKAN ET AL.: "Man-in-the-Middle Attacks in Tunneled Authentication protocols", THE 2003 SECURITY PROTOCOLS WORKSHOP, 2 April 2003 (2003-04-02)
HERMANN KRUEGLE: "CCTV Surveillance", 1 January 2007, ELSEVIER, XP002612384 *
MEYER; WETZEL: "On the Impact of GSM Encryption and Man-in-the-Middle Attacks on the Security of Interoperating GSM/UMTS Networks", PROCEEDINGS OF THE 15TH IEEE INTERNATIONAL SYMPOSIUM ON PERSONAL, INDOOR AND MOBILE RADIO COMMUNICATIONS, 5 September 2004 (2004-09-05), pages 2876 - 2883
STROBEL, IMSI CATCHER, 13 July 2007 (2007-07-13)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2744198A1 (de) * 2012-12-17 2014-06-18 Alcatel Lucent Videoüberwachungssystem unter Verwendung mobiler Endgeräte
WO2014095155A1 (en) * 2012-12-17 2014-06-26 Alcatel Lucent Video surveillance system using mobile terminals
CN104871530A (zh) * 2012-12-17 2015-08-26 阿尔卡特朗讯公司 使用移动终端的视频监控系统
CN104871530B (zh) * 2012-12-17 2017-12-26 阿尔卡特朗讯公司 使用移动终端的视频监控系统
EP3297276A4 (de) * 2015-05-12 2018-11-21 Hangzhou Hikvision Digital Technology Co., Ltd. Verfahren, system und verarbeitungsserver zur bestimmung von spurinformationen einer zielperson
US10185023B2 (en) 2015-05-12 2019-01-22 Hangzhou Hikvision Digital Technology Co., Ltd. Method, system, and processing server for determining track information of target person
WO2020114131A1 (zh) * 2018-12-06 2020-06-11 西安光启未来技术研究院 同行分析方法及装置
WO2021001769A1 (en) * 2019-07-02 2021-01-07 Verint Systems Ltd. System and method for identifying pairs of related information items
IL267783B (en) * 2019-07-02 2022-11-01 Cognyte Tech Israel Ltd System and method for identifying pairs of related information items
IL267783B2 (en) * 2019-07-02 2023-03-01 Cognyte Tech Israel Ltd System and method for identifying pairs of related information items
WO2022013593A1 (en) * 2020-07-13 2022-01-20 Telefonaktiebolaget Lm Ericsson (Publ) User identification based on location information from a third party

Also Published As

Publication number Publication date
US20110018995A1 (en) 2011-01-27
IL200065A0 (en) 2010-04-15
US9247216B2 (en) 2016-01-26
IL200065A (en) 2013-11-28
EP2280382B1 (de) 2016-08-24

Similar Documents

Publication Publication Date Title
EP2280382B1 (de) Verfahren zur video- und positionsbasierten Identifizierung
US9979901B2 (en) System and method for automatic camera hand-off using location measurements
US9165288B2 (en) Inferring relationships based on geo-temporal data other than telecommunications
EP2302602B1 (de) Systeme und Verfahren für ortsabhängige multimediale Überwachung
JP7474757B2 (ja) モバイル航空機ドローンの早期警告プライバシー侵害検出、傍受、および防衛システムおよび方法
US20160335484A1 (en) Access point stream and video surveillance stream based object location detection and activity analysis
US9179259B2 (en) Recognizing unknown actors based on wireless behavior
US20130023247A1 (en) Location Intelligence Management System
CN105263142A (zh) 一种伪基站的识别方法及装置
KR101716070B1 (ko) 모바일 단말을 사용하는 비디오 감시 시스템
US9264447B2 (en) Simultaneous determination of a mobile device and its user identification
US9025833B2 (en) System and method for video-assisted identification of mobile phone users
US9237424B2 (en) System and method for correlation of mobile communication terminals and individuals at control checkpoints
US10447637B2 (en) Method and platform for sending a message to a communication device associated with a moving object
CN113194474A (zh) 伪基站的定位方法、装置、电子设备及可读存储介质
CN110557722B (zh) 目标团伙的识别方法及相关装置
EP2624534A2 (de) Systeme und Verfahren für die Korrelation zellulärer und WLAN-Identifikatoren von mobilen Kommunikationsendgeräten
US8344878B2 (en) Method and apparatus for the creation of an event-associated electronic device directory
CN110958608B (zh) 无线网络连接方法、装置、存储介质和计算机设备
WO2015196185A1 (en) Simultaneous determination of a mobile device and its user identification
WO2023020912A1 (en) Method and system for a communications network

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME RS

17P Request for examination filed

Effective date: 20110801

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602010035756

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G08B0013196000

Ipc: H04W0004020000

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

RIC1 Information provided on ipc code assigned before grant

Ipc: H04N 7/18 20060101ALI20160224BHEP

Ipc: H04W 4/02 20090101AFI20160224BHEP

Ipc: G08B 13/196 20060101ALI20160224BHEP

INTG Intention to grant announced

Effective date: 20160314

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 824021

Country of ref document: AT

Kind code of ref document: T

Effective date: 20160915

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602010035756

Country of ref document: DE

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20160824

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 824021

Country of ref document: AT

Kind code of ref document: T

Effective date: 20160824

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161124

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161226

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161125

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602010035756

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161124

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 8

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20170526

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170723

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170731

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170723

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 9

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170723

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20100723

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20160824

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161224

REG Reference to a national code

Ref country code: DE

Ref legal event code: R081

Ref document number: 602010035756

Country of ref document: DE

Owner name: COGNYTE TECHNOLOGIES ISRAEL LTD, IL

Free format text: FORMER OWNER: VERINT SYSTEMS LTD., HERZILYA PITUACH, IL

Ref country code: DE

Ref legal event code: R082

Ref document number: 602010035756

Country of ref document: DE

Representative=s name: STUETZ, JAN H., DIPL.-ING., DE

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20230620

Year of fee payment: 14

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20230601

Year of fee payment: 14

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20230531

Year of fee payment: 14