EP1457050A1 - Method and apparatus for storing video content captured by multiple cameras
- Publication number
- EP1457050A1 (application EP02803971A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- frame
- data
- frames
- digital image
- pattern recognition
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
- G08B13/1968—Interfaces for setting up or customising the system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/254—Analysis of motion involving subtraction of images
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19613—Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
- G08B13/19615—Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion wherein said pattern is defined by the user
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19639—Details of the system layout
- G08B13/19641—Multiple cameras having overlapping views on a single scene
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19665—Details related to the storage of video surveillance data
- G08B13/19667—Details related to data compression, encryption or encoding, e.g. resolution modes for reducing data volume to lower transmission bandwidth or memory requirements
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
- G08B13/19686—Interfaces masking personal details for privacy, e.g. blurring faces, vehicle license plates
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
- G08B13/19691—Signalling events for better perception by user, e.g. indicating alarms by making display brighter, adding text, creating a sound
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/103—Selection of coding mode or of prediction mode
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/117—Filters, e.g. for pre-processing or post-processing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/146—Data rate or code amount at the encoder output
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
- H04N19/172—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/60—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/90—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using coding techniques not provided for in groups H04N19/10-H04N19/85, e.g. fractals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/913—Television signal processing therefor for scrambling ; for copy protection
- H04N2005/91357—Television signal processing therefor for scrambling ; for copy protection by modifying the video signal
- H04N2005/91364—Television signal processing therefor for scrambling ; for copy protection by modifying the video signal the video signal being scrambled
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
Definitions
- The present invention relates to a method and apparatus for storing digital video content provided from a plurality of cameras.
- Analog systems are widely used at present. This is in large part due to the lower cost of analog equipment, in terms of cameras as well as overall cost per frame of image data. Accordingly, most surveillance systems that are currently in use, even if they have digital delivery devices such as digital cameras, at some point convert the digital information into an analog form, whether that analog form is required for real-time viewing on an analog monitor or for analog storage. Thus, in conventional systems, there typically exists an analog switch or switches that route the data received from many different cameras to their respective monitors for viewing and/or to videocassette recorder units.
- The activity that is captured by a video surveillance system will in general depend on the environment that is being monitored. For many environments, there are often areas that require monitoring for activity over an extended period but that do not exhibit a great deal of activity over the course of that period. For example, a camera might be focused on the door to a bank vault 24 hours a day, but might capture relatively few individuals entering the vault or merely walking by the vault door. Under conventional arrangements, the surveillance data from such monitoring is typically stored in one of several ways. In analog systems, the surveillance data is typically stored in analog form on a videocassette recorder, as noted above.
- A surveillance system might essentially consist of an analog video camera connected to a remote video monitor, as shown in FIG. 11A, or an audio device connected to a speaker; the camera may capture audio as well as images.
- The camera is pointed at a spot of interest, e.g., a front door, an automated teller machine, etc., and provides an image of that scene to the monitor.
- An operator watches the monitor to look for unusual or unauthorized behavior at the scene. If such activity is perceived, the operator takes appropriate action - identifying the individual, notifying security or the police, etc.
- The system may have one or many cameras, each of which can be displayed in a predetermined area of the monitor. Alternatively, the operator may toggle through the scenes. Further, instead of one or more analog cameras, the system may use digital cameras such as CCD cameras and the like. Such digital cameras have the advantage of providing a high-quality, low-noise image when compared to analog images.
- Another possible video surveillance arrangement is shown in FIG. 11B.
- This system uses multiple cameras connected to the monitor via a controller.
- The controller can multiplex several camera signals and provide them to the monitor. It can also control the positions of the cameras.
- The operator uses an input device such as a keyboard, joystick or the like to direct the controller to control the motion of the cameras so that they point to particular areas within their range, track interesting features in the images, etc. The operator may also use the input device to direct the controller to provide particular ones of the camera signals to the monitor.
- FIG. 11C shows another arrangement of a video surveillance system.
- A video recording device is connected to the camera outputs, the monitor input, or both.
- The video recording device, e.g., a video cassette recorder for analog cameras, can record the camera signals for archival, later review, and the like. Further, it can record images displayed on the monitor as evidence of activities taking place in the environments being monitored.
- The video storage device may be a digital storage device, a mass storage device such as a hard disk drive, or the like. When a hard disk drive is used, it may be a unit separate from the user controller and camera controller, or it may be part of an integrated system.
- If the cameras are analog models, their signals may be stored on analog or digital storage devices.
- With an analog storage device or devices, such as video cassette recorders, the camera signal or signals are stored on videotape much like a television signal.
- With a digital storage device, the camera images are pixelated and stored as data files.
- The files may be uncompressed, or they may be compressed using a compression algorithm to maximize use of the storage space.
- The present invention provides a distributed surveillance system that allows for the digital storage of data, as well as the recognition of external patterns obtained in parallel operations. Further, digital data can be stored at different levels of compression, and pattern recognition can be achieved while the digital data is still in its compressed form.
- The embodiments of the present invention described herein provide advantageous techniques for data frame adaptation to minimize storage size, source noise cancellation, and data frame delivery device source authentication in, for example, a surveillance system.
- The present invention describes methods and systems for adapting the size of a digital data frame to minimize data storage, for cancelling source noise resident in a digital data frame, and for authenticating the source of a digital data frame.
- One embodiment of the present invention provides a control program which controls the digital storage device.
- The control program monitors the status of the digital storage device. When the storage device (or the portion thereof allocated for image storage) becomes full and new information needs to be added, the control program directs the storage device to delete information to make room for the new information. This is done based on various data parameters, such as the priority of individual messages or data units, the age of each message or data unit, and the like. For example, when information needs to be deleted from a digital storage device to make room for new information, older data from high-priority cameras may be saved instead of newer data from low-priority cameras. In this way, efficient use of the digital storage system can be made.
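- A minimal sketch of such a deletion policy follows. The StoredUnit fields, the 0-10 priority values, and the exact ordering (lowest-priority cameras first, then oldest data first within a priority) are illustrative assumptions drawn from the parameters named above, not the literal implementation of the control program.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Illustrative record for one stored message or data unit.
struct StoredUnit {
    std::uint64_t id;
    int           cameraPriority;  // assumed 0 = highest priority, 10 = lowest
    std::uint64_t ageSeconds;      // time since the unit was recorded
    std::uint64_t sizeBytes;
};

// Free at least bytesNeeded of space.  The ordering used here is one possible
// policy consistent with the priority/age parameters above, not the only one.
std::uint64_t freeSpace(std::vector<StoredUnit>& store, std::uint64_t bytesNeeded) {
    std::sort(store.begin(), store.end(),
              [](const StoredUnit& a, const StoredUnit& b) {
                  if (a.cameraPriority != b.cameraPriority)
                      return a.cameraPriority > b.cameraPriority;  // delete low priority first
                  return a.ageSeconds > b.ageSeconds;              // then oldest first
              });
    std::uint64_t freed = 0;
    while (!store.empty() && freed < bytesNeeded) {
        freed += store.front().sizeBytes;      // "delete" the least valuable unit
        store.erase(store.begin());
    }
    return freed;
}
```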
- The invention also includes a method of detecting an occurrence of an event within a sequence of stored frames relating to a scene, as well as methods that operate upon the data relating to the scene once an event is detected.
- The method of detecting the occurrence of an event includes comparing a first stored frame to a later stored frame to determine whether a change in size between the frames exists that is greater than a predetermined threshold.
- The first and later stored frames are compressed, and are operated upon while compressed to determine the occurrence of an event.
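- Because only the sizes of the compressed frames are compared, the check can be expressed without decoding anything. The sketch below is a hedged illustration of that idea; the relative-change threshold and the byte-buffer frame representation are assumptions made for the example.

```cpp
#include <cmath>
#include <vector>

// A compressed frame is treated here simply as a byte buffer; only its size matters.
using CompressedFrame = std::vector<unsigned char>;

// Returns true if the later frame's compressed size differs from the earlier
// frame's size by more than threshold (e.g. 0.20 for a 20% change), taken here
// as an indication that an event has occurred in the scene.
bool sizeChangeEvent(const CompressedFrame& earlier,
                     const CompressedFrame& later,
                     double threshold) {
    if (earlier.empty()) return false;
    const double delta = static_cast<double>(later.size()) -
                         static_cast<double>(earlier.size());
    return std::abs(delta) / static_cast<double>(earlier.size()) > threshold;
}
```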
- Also provided are methods of operating upon the data relating to the scene once an event is detected, which include providing a still image at intervals, typically every 5-10 frames, using the data from at least the recording device that detected the event.
- Still images from other recording devices that are associated in a predetermined manner with the recording device that detected the event are also obtained at intervals.
- A monitor program monitors images coming from one or more surveillance cameras. When an image or set of images satisfies certain conditions, the monitor program takes an appropriate action.
- A method of automatically monitoring a game of chance includes operating a video camera to obtain a stream of data that captures a plurality of repetitive actions relating to the game of chance, and automatically parsing the stream of data to count the plurality of repetitive actions, the count so obtained providing an indicator usable to monitor the game of chance.
- FIG. 1 illustrates a block diagram of the digital video content storage system according to at least one embodiment of the invention
- FIG. 2 illustrates a block diagram of software modules used by different processors according to at least one embodiment of the invention
- FIG. 3 is a block diagram illustrating a transmission system according to at least one embodiment of the invention.
- FIGS. 4A through 4D are diagrams illustrating exemplary size adjustments to frames based on whether motion is or is not present in an area being monitored;
- FIG. 5 is a flow diagram illustrating an exemplary noise pattern discovery process according to at least one embodiment of the invention.
- FIG. 6 is a flow diagram illustrating an exemplary noise correction process according to at least one embodiment of the invention.
- FIG. 7 illustrates an exemplary system according to at least one embodiment of the invention
- FIG. 8 illustrates compressed frames produced by a camera which has an object pass through its field of view
- FIG. 9A illustrates a process for detecting and reacting to a change in the size of frames of a source
- FIG. 9B illustrates several possible examples of operations that can be performed in reaction to the detection of an occurrence of an event according to at least one embodiment of the invention.
- FIG. 10 illustrates four screen displays in which a person caused a change in the size of frames to occur
- FIGS. 11A-11C show various video surveillance system arrangements
- FIG. 12 shows a video surveillance system according to an embodiment of the present invention
- FIG. 13 shows a video surveillance system according to an embodiment of the present invention
- FIG. 14 illustrates a gaming table according to one embodiment of the present invention
- FIGS. 15A-15B illustrate a player place setting having a bet area and a play area
- FIG. 16 illustrates a sequence of repetitive actions that are possible in a game played in accordance with an embodiment of the present invention
- FIG. 17 illustrates a mask for a player place setting in which no cards and bets are present and a mask for a dealer setting in which no cards are present;
- FIG. 18A illustrates a roulette layout and mask for usage with the roulette layout
- FIGS. 18B-18C illustrate a roulette wheel and a mask for usage with the roulette wheel
- FIG. 19 illustrates a sequence of repetitive actions for a roulette wheel and ball
- FIGS. 20A -20B illustrate exemplary reports generated from repetitive actions being monitored
- The present invention provides a distributed surveillance system and methods of using the system.
- Features included are: digital storage of surveillance information, both visual and audible; pattern recognition of external patterns within obtained streams of surveillance information from different sources; distributing external patterns to various computer locations where such external patterns can be used for searching; automatically interpreting patterns of repetitive activity and generating reports therefrom; distributing individual images that are automatically determined to contain a particular image of interest; and establishing pattern recognition rules to detect activity of interest, as well as others described herein.
- FIG. 1 illustrates an exemplary system 100 according to the present invention, and various different devices that will allow for the various permutations described herein to be understood, although it is understood that this exemplary system 100 should not be construed as limiting the present invention.
- FIG. 1 illustrates a plurality of conventional cameras 110-1 to 110-n, analog or digital, each of which is connected to a computer 120-1 to 120-m and preferably contains systems for detecting both images and sound.
- The connections between the cameras 110 and the computers 120 can take many forms; cameras are shown both connected in a one-to-one correspondence and with a number of cameras 110 connected to a single computer 120.
- A computer 120 is also shown as not being connected to any camera, to illustrate that digital data stored in any digital format, such as data already stored on a computer, can be operated upon by the present invention.
- The computers 120 are each preferably coupled, using a network 150 of some type such as the Internet or a private network, to a central server 130.
- Each of the computers 120, and the cameras 110 connected thereto, can be in any number of disparate locations. While a computer 120 and the cameras 110 connected to it are preferably within or close to the same building in order to minimize the distance that signals from the cameras 110 must travel until they reach the computer 120, the cameras can be on different floors of the same building, and the computers 120 can be within different buildings. And while the system is illustrated for convenience herein as having a single central server 130, it will be appreciated that any or all of the computers 120 could be configured to act as the central server described herein.
- It is preferable to place the central server 130 at a location that is remote from the location where the surveillance is being performed, and even more preferable to have that remote location be in a physically separate building.
- A particular company may have its own server 130, and, if desired, different servers 130 can be connected together through the network 150 to permit sharing of certain data between them.
- The system 100 will be described with reference to a single server 130, which can serve a multitude of different companies, to illustrate the most flexible aspects of the system 100 described herein.
- Computers 140-1 to 140-n are also illustrated as being coupled to the server 130 through the network 150.
- Data received by computers 140 can be decrypted and decoded for subsequent usage, such as being perceived visibly using monitor 142 and audibly using speaker 144 shown.
- This particular exemplary system 100 thus allows the cameras 110 to provide at least images, and sound if desired, which can be encoded and stored for surveillance purposes.
- This data can be stored in digital form on a computer 120 or the server 130, as described further hereinafter.
- Patterns within images or sound obtained from cameras 110 that correspond to external patterns can also be recognized, as will be described further herein, using either the computer 120 provided with the image data from the camera 110, the server 130, or both.
- Transmission of data between each computer 120 or 140 and server 130 is also preferably encrypted, as described further hereinafter.
- The data is decrypted in order to operate on it.
- Each of the different computers 120, 130 and 140 could be implemented as a single computer, and devices other than computers that contain processors able to implement the inventions described herein can be used instead. Many other variants are possible.
- While such a device will typically include a processor of some type, such as an Intel Pentium 4 microprocessor or a DSP, in conjunction with program instructions that are written based upon the teachings herein, other hardware that implements the present invention can also be used, such as a field programmable gate array.
- The program instructions are preferably written in C++ or some other computer programming language. Routines that are used repeatedly can be written in assembler for faster processing.
- The present invention also includes software 200 that is resident on each of the computers 120 and 140, as well as the server 130, to provide for the variety of functions described herein. It will be appreciated that various modules 210 of the software 200 are resident on different computers, thereby allowing the functions described herein to be achieved.
- Modules 210 that are typically contained within and operate in conjunction with the computers 120 are:
- -local pattern recognition module 210-2, which allows for both internal pattern recognition and external pattern recognition, as described hereinafter;
- Modules 210 that are typically contained within and operate in conjunction with the server 130 are:
- -network pattern recognition module 210-12, which allows for both internal pattern recognition and external pattern recognition, and includes network priority information, event alert generation, and repetitive action pattern analysis and associated report generation, as described hereinafter;
- Local front end processing module 210-1, local user interface module 210-5, and network interface module 210-15
- The local front end processing module 210-1, local user interface module 210-5, and network interface module 210-15 will first be described.
- The surveillance system 100 allows recording devices, typically cameras 110, to be placed at various locations associated with a building, and allows buildings at disparate locations to be served.
- Each building will contain at least one local computer 120, although the number of local computers needed will vary depending upon the amount of processing power desired relative to the number of recording devices and other processing operations that take place at the location itself.
- The local user interface module 210-5 is used to set up and keep track of connections between cameras that are included on the local system.
- A single local interface module, typically associated with a single company, will keep track of company information, camera information, and user information.
- Company information tracked will typically include the company name, and the address information for each building of which surveillance is required.
- Camera information that is associated with a particular building includes the model of the camera; a camera priority; a camera identifier that can be used by both the local system and the network system; the location of the camera, which can include not only the location that it views but also a reference to a particular activity, such as a particular gaming table number in an embodiment for usage at a casino; a gaming type identifier if the camera is used for a gaming table; an operation status identifier (functioning or out-of-service); and a camera purpose identifier, which can indicate, for example in the context of usage at a casino, whether the camera is used for gaming or non-gaming purposes. How these are used will be further described hereinafter.
- User information identifies users who can access the system, and the degree of access that they have to the system. Certain users will have view access only, which may be of only certain views or all views, while other users will have access to make changes, run reports, and the like as described further herein.
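- The company, camera, and user information described above maps naturally onto simple configuration records. The layout below is one hypothetical way the local user interface module 210-5 might hold it; the field names and types are illustrative only and are not taken from the patent itself.

```cpp
#include <string>
#include <vector>

enum class AccessLevel { ViewSome, ViewAll, Administer };  // view-only vs. full access

struct CameraInfo {
    std::string model;
    int         priority = 10;          // assumed 0 = highest, 10 = lowest
    std::string cameraId;               // shared by the local and network systems
    std::string location;               // viewed area, or e.g. a gaming table number
    std::string gamingType;             // only meaningful for gaming-table cameras
    bool        inService = true;       // operational status
    bool        gamingPurpose = false;  // gaming vs. non-gaming use (casino example)
};

struct UserInfo {
    std::string name;
    AccessLevel access = AccessLevel::ViewSome;
};

struct BuildingInfo {
    std::string address;
    std::vector<CameraInfo> cameras;
};

struct CompanyInfo {
    std::string name;
    std::vector<BuildingInfo> buildings;
    std::vector<UserInfo>     users;
};
```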
- With respect to the set-up of each camera, the individual camera 110 not only needs to be set up and connected with the system using the local user interface module 210-5, but also needs to be configured to operate properly.
- The local front end processing module 210-1 is preferably used to configure the camera to operate as effectively as it can. Configuring the individual camera 110 using the local user interface module 210-5 is described below with respect to FIGS. 3-6 and in a co-pending U.S. application.
- The local user interface module 210-5 is also configured to transmit the information relating to the local system to the network interface module 210-15. It is understood that the network interface module 210-15 will preferably contain the same information as exists with respect to each different local user interface module 210-5. In addition, since the network interface module 210-15 receives information from a multitude of different computers 120, the network interface module 210-15 will also include network related features, such as network establishment of the network external patterns desired for pattern recognition as described further herein, network camera priority rules, network back-up procedures, and network report generation. The manner in which these various local and network features are implemented will be described hereinafter, but it will be understood that the local user interface module 210-5 and the network interface module 210-15 together allow for access to the results of the operations performed by the other modules 210 described hereinafter.
- The local user interface module 210-5, when setting up a camera, will establish a priority for that camera. The priority scheme can take many different forms, one being a 0-10 priority, with 0 being the highest priority and 10 being the lowest priority.
- In addition to the local priority setting, there may be another network priority setting, thus ensuring that certain types of data are transmitted from the computer 120 to the server 130.
- Network camera priority settings also exist, which cause the network interface module 210-15 to initiate transfers of data from certain cameras 110 at a higher priority than from other cameras 110.
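- One hypothetical way to realize this network camera priority is a priority queue of pending transfers drained highest-priority-first; the sketch below is written under that assumption and is not a description of the actual module. Popping from such a queue yields the pending transfer from the highest-priority camera first.

```cpp
#include <queue>
#include <string>
#include <vector>

// A pending transfer of recorded data from one camera to the server.
struct PendingTransfer {
    std::string cameraId;
    int networkPriority = 10;           // assumed 0 = highest, 10 = lowest
    std::vector<unsigned char> payload;
};

// Order so that transfers from higher-priority cameras (lower value) come out first.
struct ByNetworkPriority {
    bool operator()(const PendingTransfer& a, const PendingTransfer& b) const {
        return a.networkPriority > b.networkPriority;
    }
};

using TransferQueue = std::priority_queue<PendingTransfer,
                                          std::vector<PendingTransfer>,
                                          ByNetworkPriority>;
```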
- The local pattern recognition module 210-2 and the network pattern recognition module 210-12 each provide the capability to recognize patterns within data.
- That data includes image data that is formatted in frames. Patterns recognized include patterns within the data, as well as external patterns that are externally generated and searched for within the data, as described further hereinafter.
- Both the local pattern recognition module 210-2 and the network pattern recognition module 210-12 can use a variety of pattern recognition techniques, but preferably use the pattern recognition techniques described in the U.S. application bearing attorney reference number 042503/0259665 entitled “Method And Apparatus For Determining Patterns Within Adjacent Blocks Of Data,” filed on October 31, 2001, and assigned to the same assignee as the present invention, which can be used to perform pattern recognition and compression, and the contents of which are expressly incorporated by reference herein.
- The network pattern recognition module 210-12 can operate upon data that has been consistently compressed. Compression is preferably achieved using the techniques described in Appln. No. 09/727,096 entitled “Method And Apparatus For Encoding Information Using Multiple Passes And Decoding In A Single Pass,” filed on November 29, 2000. In this compression scheme, as noted in the section below, recorded data can be subjected to multiple passes to achieve further compression.
- Both the local pattern recognition module 210-2 and the network pattern recognition module 210-12 also provide for the recognition of external patterns within images that are externally generated. So that external patterns can be detected irrespective of which amount of compression is used, as noted in further detail below, the external patterns are stored at each of the different levels of compression that exist.
- The network interface module 210-15, in conjunction with the local pattern recognition module 210-2 and the network pattern recognition module 210-12, is used to identify and keep track of the external patterns desired for pattern recognition and the priority of each external pattern.
- An external pattern will represent an object, which could be the face of a person, a particular article such as a purse or a card, a vehicle, or a vehicle license plate.
- The present invention will store a representation of that object and use that representation to compare against patterns that are within the recorded image.
- A limiting factor in the ability of the system 100 to track external patterns is that the system 100 is already obtaining data from various cameras 110 and compressing that data as described above. Performing that task alone requires substantial processing power, leaving only some percentage, based upon the computing power available, for tracking external patterns of objects.
- The network interface module 210-15 will keep track of the priority of each external pattern that will be searched for based upon the input from each camera 110. Certain of the highest priority external patterns are distributed to computers 120 in an uncompressed form, using a sufficient number of points, such as 25, expressed in vector form as (x,y) offsets as is known, to allow for sufficiently accurate pattern detection; that pattern recognition takes place using another processor thread dedicated to searching for a particular pattern, as described in a co-pending U.S. application.
- Because computers 120 are typically operating upon real-time images as they are received, the external patterns located by computers 120 will be obtained more quickly than external patterns found by server 130, which need not be searched for in real time. If, however, a computer 120 cannot complete its search for external patterns, it will so notify server 130, which will then preferably search for all desired external patterns and assume that the computer 120 did not find any.
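- This division of labor can be summarized as a time-budgeted local search with a server fallback, as in the sketch below. It only illustrates the control flow; Frame, ExternalPattern, matchPattern() and notifyServerToSearch() are hypothetical placeholders, not the actual pattern-recognition interfaces of the system.

```cpp
#include <chrono>
#include <vector>

struct Frame { std::vector<unsigned char> data; };   // placeholder image frame
struct ExternalPattern { int priority = 0; };         // ~25 (x,y) offset points omitted

// Hypothetical stand-ins for the real matcher and the notification to server 130.
bool matchPattern(const Frame&, const ExternalPattern&) { return false; }
void notifyServerToSearch(const Frame&) {}

// Search the highest-priority patterns locally within a time budget; if the
// budget runs out before all patterns are checked, hand the frame to the
// server, which searches the full set and assumes nothing was found locally.
std::vector<const ExternalPattern*>
searchLocally(const Frame& frame,
              const std::vector<ExternalPattern>& patterns,  // assumed sorted by priority
              std::chrono::milliseconds budget) {
    std::vector<const ExternalPattern*> hits;
    const auto deadline = std::chrono::steady_clock::now() + budget;
    for (const auto& p : patterns) {
        if (std::chrono::steady_clock::now() > deadline) {
            notifyServerToSearch(frame);   // local search incomplete
            hits.clear();                  // server proceeds as if nothing was found here
            break;
        }
        if (matchPattern(frame, p)) hits.push_back(&p);
    }
    return hits;
}
```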
- The local multi-pass module 210-3 and the network multi-pass module 210-13 operate to provide further compression beyond that obtained by the local pattern recognition module 210-2 and the network pattern recognition module 210-12. That further compression is preferably achieved using the techniques described in Appln. No. 09/727,096 entitled “Method And Apparatus For Encoding Information Using Multiple Passes And Decoding In A Single Pass,” filed on November 29, 2000, and assigned to the same assignee as the present invention, the contents of which are expressly incorporated by reference herein, or other encoding/decoding processes. While each of the local multi-pass module 210-3 and the network multi-pass module 210-13 can be configured to operate in the same manner, typically that is not the case.
- The local multi-pass module 210-3 is configured to perform some predetermined number of passes on the data that it receives in order to partially compress that data before it is transmitted by the computer 120 to the server 130 for further operations, including further compression operations performed by the network multi-pass module 210-13 using further passes to further compress the data.
- It is preferable that each of the local multi-pass modules 210-3 use the same number of passes and the same compression routines, so that the data received by the server and further operated upon using the network multi-pass module 210-13 is already consistently compressed across the various sources from which it receives data.
- The network multi-pass module 210-13 will preferably contain additional multi-pass compression routines not found in the local multi-pass modules 210-3 that allow further passes to occur.
- For example, passes 1 and 2 are performed by the computer 120, whereas further passes are performed by the server 130.
- Data can be saved with a user-specified amount of compression. Since images may be recorded at different levels of compression, for each different compression level there must be an associated compression of all of the external patterns. Thus, if images are recorded at one of 1, 2, 5, 10, 15 or 20 passes, then external patterns must be obtained for each of 1, 2, 5, 10, 15 and 20 passes, so that the appropriately compressed external pattern can be used during comparison operations, depending upon the compression of the image.
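- In code, this amounts to keeping each external pattern once per supported pass count and always fetching the copy whose pass count matches the image being searched. The container layout below is an assumption made for the sketch, not the module's actual data structure.

```cpp
#include <array>
#include <map>
#include <stdexcept>
#include <vector>

using CompressedPattern = std::vector<unsigned char>;

// The pass counts at which images may be recorded, per the example above.
constexpr std::array<int, 6> kPassLevels = {1, 2, 5, 10, 15, 20};

// Each external pattern is stored once per supported pass count.
struct ExternalPatternSet {
    std::map<int, CompressedPattern> byPassCount;

    // Return the copy compressed at exactly the same number of passes as the
    // image being searched, so the comparison is made like-for-like.
    const CompressedPattern& forImagePasses(int imagePasses) const {
        auto it = byPassCount.find(imagePasses);
        if (it == byPassCount.end())
            throw std::runtime_error("no pattern stored at this compression level");
        return it->second;
    }
};
```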
- The local encryption/decryption module 210-4 and the network encryption/decryption module 210-14 each perform the same functions: encrypting and transmitting data to a destination, and receiving and decrypting data that has been previously encrypted and transmitted. While many encryption/decryption techniques exist, one technique that is advantageously implemented is described in U.S. Application No. 09/823,278 entitled “Method And Apparatus For Streaming Data Using Rotating Cryptographic Keys,” filed on March 29, 2001, and assigned to the same assignee as the present invention, the contents of which are expressly incorporated by reference herein.
- 9. LOCAL PRIORITY DATA STORAGE MODULE 210-6 AND NETWORK PRIORITY DATA STORAGE MODULE 210-16
- The local priority data storage module 210-6 and the network priority data storage module 210-16 keep track of the data stored on the computers 120 and the server 130, respectively. These priority data storage modules are different from data backup, and assume a worst-case scenario in which no data backup has occurred. In essence, both the local priority data storage module 210-6 and the network priority data storage module 210-16 operate in the same manner: to keep the data that is most important. Specifics on how data is differentiated and how these modules operate are described in the U.S. application bearing attorney reference number 042503/0273342 entitled "System And Method For Managing Memory In A Surveillance System," filed on the same day as this application and assigned to the same assignee as the present invention, the contents of which are expressly incorporated by reference herein.
- Each local user interface module 210-5 can operate if not connected to the network, with the resident software continuing to perform the functions that it typically would in conjunction with its associated computer 120.
- The data saved, should prioritizing be necessary, will be in accordance with the patent application referred to in the immediately preceding paragraph.
- The surveillance system described herein is presented in the context of a casino, which includes gaming tables, vault areas, and other common areas requiring surveillance. It will be understood, however, that the present invention can be implemented in any environment requiring security, including financial institutions, large commercial buildings, jewelry stores, airports, and the like.
- In an airport, for example, the initial gate entry area can represent one level of security, while the actual gate can represent another level of security.
- Similarly, in a financial institution, an entry area can represent one level of security and a vault within can represent another level of security. Comparisons of images from these related areas, and the generation of alerts and other information based thereupon, can be used in the environments further described below. Certain characteristics of each environment can be used in making comparisons. For example, in a jewelry store, comparisons can be made against a mask of an unbroken glass display case, such that if the case breaks, an alert can be sounded.
- The present invention can also be adapted for surveillance in environments where a wired connection between the computer 120 and the server 130 is not possible, such as on an airplane. In such an environment, the available transmission bandwidth becomes even more of a limiting factor than in a wired network. Accordingly, in such an environment, the computer 120 will typically be limited to performing the compression operations described above and wirelessly transmitting the compressed information, and pattern recognition and other operations will take place at the server 130, which is adapted to receive the wirelessly transmitted data.
- The embodiments describe methods and systems for adapting the size of a digital data frame to minimize data storage, for correcting source noise resident in a digital data frame, and for authenticating the source of a digital data frame.
- FIG. 3 is a block diagram illustrating an exemplary transmission system 300 according to the present invention, and various different devices that will allow the various permutations described herein to be understood, although it is understood that this exemplary transmission system 300 should not be construed as limiting the present invention.
- The system 300 includes source data delivery devices 310, for example, conventional cameras 310-1 to 310-N, each of which is connected to a computer device 320 at a data interface 318 via respective transmission equipment 316-1 to 316-N.
- The source data delivery devices 310-1 to 310-N preferably contain systems for detecting both images and sound, although devices that can reproduce images or sound but not both are also within the scope of the present invention.
- The source data delivery devices 310 can be analog or digital.
- Source delivery devices 310 generate noise that becomes overlaid onto the recorded signal.
- The delivery devices that are most susceptible to producing large amounts of noise are those devices 310 that record images, in other words, cameras.
- While there exist high quality cameras that produce only slight amounts of such noise, the cameras used in many surveillance environments are often of low grade quality. As such, the cameras often generate a substantial amount of noise that is overlaid onto the actual image that is being recorded.
- This noise results from a combination of the internal elements that are used to record the image, including the optical systems, transducers, digital circuits, the power source and AC/DC converters, and the like. It has been found, however, that once a camera has been turned on for a period of time, it reaches steady state operation, such that the noise will repeat in a cyclic noise pattern.
- The present invention, as described hereinafter, exploits this property to eliminate cyclic noise from the recorded image.
- Certain aspects of the present invention correct for the noise signature of devices 310 such as cameras.
- A conventional arrangement for an analog camera device 310 includes analog transmission equipment 316A that includes analog transmission lines and amplifiers placed at intervals along the analog transmission lines to refresh the analog signals as needed.
- A typical arrangement for a digital camera device 310 includes digital transmission equipment 316D that typically includes only an optical transmission line, as digital signals can travel along an optical transmission line over distances much greater than analog signals can travel, as is known.
- The data interface 318 in many cases will include an analog to digital (A/D) converter after the analog switch so that analog signals output from the switch can be converted to digital form for input into the computer 320. For systems that contain digital recording devices 310, a digital to analog (D/A) converter exists before the analog switch to convert the digital signals to analog form, so that they can be operated upon by the analog switch.
- The transmission medium 316 will contribute noise as well, particularly from signal degradation and amplifier distortion in the analog context, and from digital to analog and analog to digital conversion in the digital context.
- Devices 310, and particularly low quality cameras, generate a cyclic noise pattern, which pattern is further altered as a result of the transmission medium 316.
- One obvious component of this noise tends to come from the power used to drive the electrical components. While a DC voltage is typically used to drive circuit components, this DC voltage is typically obtained as a result of a conversion from an AC source, which in the United States oscillates at 60 Hz. Thus, this AC noise becomes one component of the source noise, and can have a particularly severe effect since most image devices 310 record images at 30 frames/sec, a frame rate that is relatively close to the oscillating frequency of the AC power signal.
- As an initial step, a set-up is first preferably performed, so that the recording device 310 and the transmission medium 316 associated with that device are in place. This ensures stability of the initializing routine.
- In step 510, the initializing steps are begun, with the first initializing step 520 being to turn on the device 310 after the computer 320 is configured to record the output of the device 310. It is noted that in this initial configuration, the amount of time that the device 310 will require to heat up before it exhibits a cyclic noise pattern is unknown.
- When initially turned on, the camera records the image of a known color pattern, such as a known white blotter. Initially, as shown by step 530, the image is recorded for some number of frames, typically in the range of 200-300 frames. Each of these frames is then compared against a stored "white" image that contains pixel representations corresponding to the actual known color pattern to obtain a difference frame, as shown by step 540. In the following step 550, these difference frames are compared against one another to determine if there is any repetition of patterns between them. While conventional pattern recognition algorithms can be used, preferably the pattern recognition algorithm described in the U.S. application bearing attorney reference number 042503/0259665 mentioned above is used.
- For this comparison, each frame can be designated a reference frame and compared to each of the other frames, with each of the other frames being a target frame for purposes of that comparison. It will be appreciated, however, that comparing every frame against every other frame in this way leads to redundant comparisons, and thus a lesser number of comparisons is needed.
- If, following step 550, a cyclic noise pattern is uncovered, that cyclic noise pattern can be stored in step 560.
- If, however, a cyclic noise pattern was not uncovered, then the recording device 310 is operated in step 570 for a period of time longer than it was previously, and the recording stored. Thereafter, step 550 is repeated, using the larger number of recorded frames to uncover the cyclic noise pattern. Steps 560, 570, and 550 then repeat until a cyclic noise pattern is found.
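- A simplified sketch of this discovery loop follows: frames of the known white target are differenced against the stored white reference, and the difference frames are scanned for a period after which they repeat. The mean-absolute-difference test is a crude stand-in for the block-based pattern recognition referenced above, and the grayscale frame layout and tolerance value are assumptions.

```cpp
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <optional>
#include <vector>

using Frame = std::vector<std::uint8_t>;   // one grayscale frame, fixed width * height

// Difference between a recorded frame of the known white target and the stored
// white reference; what remains is (mostly) the device noise for that frame.
Frame differenceFrame(const Frame& recorded, const Frame& whiteReference) {
    Frame diff(recorded.size());
    for (std::size_t i = 0; i < recorded.size(); ++i)
        diff[i] = static_cast<std::uint8_t>(
            std::abs(static_cast<int>(recorded[i]) - static_cast<int>(whiteReference[i])));
    return diff;
}

// Two difference frames "match" if their mean absolute difference is small.
bool framesMatch(const Frame& a, const Frame& b, double tolerance = 2.0) {
    long long total = 0;
    for (std::size_t i = 0; i < a.size(); ++i)
        total += std::abs(static_cast<int>(a[i]) - static_cast<int>(b[i]));
    return static_cast<double>(total) / a.size() <= tolerance;
}

// Search the sequence of difference frames (typically 200-300 of them) for a
// period after which the noise repeats; returns the period if found, otherwise
// nothing, in which case a longer recording is made and the search retried.
std::optional<std::size_t> findCyclicPeriod(const std::vector<Frame>& diffs) {
    for (std::size_t period = 1; period <= diffs.size() / 2; ++period) {
        bool repeats = true;
        for (std::size_t i = 0; i + period < diffs.size(); ++i)
            if (!framesMatch(diffs[i], diffs[i + period])) { repeats = false; break; }
        if (repeats) return period;
    }
    return std::nullopt;
}
```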
- Once found, the cyclic noise pattern can be used to remove the noise from the recorded data, and thus obtain a better representation of that which is being detected, such as the image if the device 310 is a camera.
- Before recording begins, an initialization period corresponding to the previously determined heat-up period is preferably allowed to occur, so that the device 310 enters steady state operation.
- Recording of the desired scene can then begin, as shown by step 620.
- Each recorded frame is input into the computer 320 and, as shown by step 630, is synchronized with a corresponding frame from the cyclic noise pattern to remove the cyclic noise therefrom. Accordingly, as shown by step 640, each frame with the cyclic noise removed is obtained. The frames can then be used as desired in subsequent surveillance operations.
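- Once the cyclic noise pattern is known, removal can be as simple as keeping the pattern in phase with the incoming frames and subtracting the matching noise frame pixel by pixel. The per-pixel subtraction below is an illustrative simplification of the synchronization and removal described for steps 630 and 640, under the same grayscale-frame assumption as above.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

using Frame = std::vector<std::uint8_t>;

// Pairs frame k of the live stream with noise frame (k mod period) and
// subtracts the noise from each pixel, clamping at zero.
class CyclicNoiseRemover {
public:
    explicit CyclicNoiseRemover(std::vector<Frame> noiseCycle)
        : noiseCycle_(std::move(noiseCycle)) {}

    Frame clean(const Frame& recorded) {
        const Frame& noise = noiseCycle_[index_];
        index_ = (index_ + 1) % noiseCycle_.size();   // stay synchronized with the cycle
        Frame out(recorded.size());
        for (std::size_t i = 0; i < recorded.size(); ++i)
            out[i] = static_cast<std::uint8_t>(
                std::max(0, static_cast<int>(recorded[i]) - static_cast<int>(noise[i])));
        return out;
    }

private:
    std::vector<Frame> noiseCycle_;
    std::size_t index_ = 0;
};
```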
- The present invention exploits the obtained cyclic noise pattern in several ways.
- The cyclic noise pattern is preferably detected within each frame and eliminated or minimized.
- In addition, watermarking of particular frames generated by a source recording device 310 is performed using the noise signature.
- The camera noise is not removed from every nth frame, in order to obtain a detectable watermark indicating that the frame actually comes from that particular source recording device 310. If a different source recording device 310' were instead used, a different noise pattern would exist, and the expected noise pattern would not be found.
- This noise creates a digital signature that will identify the frame as having come from the particular recording device 310 rather than from a different recording device 310', thus foiling any attempts to introduce a substitute stream of data.
- In order to be able to verify later in time the specific camera that recorded a specific sequence, it is preferable that the cyclic noise pattern also be stored with the sequence when the sequence is stored, to ensure that such verification can be made later.
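- Verification of the watermark can then check that exactly the frames in which the noise was deliberately left still carry the stored noise signature. The zero-mean correlation test below is a crude, illustrative stand-in for the pattern recognition the text relies on; the every-nth-frame indexing and the threshold value are assumptions.

```cpp
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <vector>

using Frame = std::vector<std::uint8_t>;

// Zero-mean normalized cross-correlation between a frame and the expected noise
// frame; a value near 1 suggests the noise signature is still present.
double noiseCorrelation(const Frame& frame, const Frame& expectedNoise) {
    const std::size_t n = frame.size();
    if (n == 0) return 0.0;
    double meanF = 0, meanN = 0;
    for (std::size_t i = 0; i < n; ++i) { meanF += frame[i]; meanN += expectedNoise[i]; }
    meanF /= n; meanN /= n;
    double dot = 0, normF = 0, normN = 0;
    for (std::size_t i = 0; i < n; ++i) {
        const double f = frame[i] - meanF, g = expectedNoise[i] - meanN;
        dot += f * g; normF += f * f; normN += g * g;
    }
    return (normF == 0.0 || normN == 0.0) ? 0.0 : dot / std::sqrt(normF * normN);
}

// Every nth frame should still correlate with the stored cyclic noise of the
// claimed source camera; a substituted stream would fail this check.
bool verifySource(const std::vector<Frame>& sequence,
                  const std::vector<Frame>& storedNoiseCycle,
                  std::size_t n, double threshold = 0.5) {
    for (std::size_t k = 0; k < sequence.size(); k += n) {
        const Frame& expected = storedNoiseCycle[k % storedNoiseCycle.size()];
        if (noiseCorrelation(sequence[k], expected) < threshold) return false;
    }
    return true;
}
```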
- It is noted that while the cyclic noise removal of the present invention is described in terms of real-time elimination of the cyclic noise pattern, the cyclic noise removal can also operate upon data that has been previously stored. And while having the actual recording device used to record the data is desirable, noise patterns can be detected in stored data even without having the actual camera.
- The amount of data that is recorded by the video surveillance system 300 depends on the environment that is being monitored. For many environments, there are often areas that require monitoring for activity over an extended period but that do not exhibit a great deal of activity over the course of that period. For example, the camera 310 might be focused on the door to a bank vault 24 hours a day, but might capture relatively few individuals entering the vault or merely walking by the vault door. This can easily be contrasted with the case of frames from a motion picture or from a video camera that is trained on a busy area with much traffic.
- Exemplary aspects of the present invention exploit monitoring of environments that do not exhibit a great deal of activity over the course of an extended period of monitoring. Rather than storing all of the surveillance data recorded, another aspect of the present invention reduces the amount of storage by reducing the stored image resolution for frames of data corresponding to no motion being detected.
- Frames of digital image data are typically made up of pixels, with each pixel having, for example, a 16, 24, or 32 bit RGB representation. Since the resolution of a particular frame increases as the number of pixels used to represent the frame increases, to conserve data storage space that would otherwise be consumed by filming environments exhibiting no activity for extended periods, after a predetermined period of time of storing full-sized frames during which no motion is observed, the resolution of the stored frame is reduced to some fraction, for example one-quarter, of the size of the full-sized frame. The smaller frame size is used until a frame with motion appears. Then, the stored frame size is increased back to the larger frame size. It should be understood that the lower the fraction, the greater the reduction in storage space typically needed to store the data. While less or more than 25% resolution can be stored, this amount has been found to be a good compromise between maintaining clarity of the image and reducing the data stored, which, as will be appreciated, are competing requirements.
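- A minimal sketch of the quarter-resolution reduction: each 2x2 block of the full-sized frame is averaged into one pixel of the reduced frame. The 8-bit grayscale pixel format and the averaging filter are assumptions for the example; the description above does not prescribe a particular downscaling method.

```cpp
#include <cstdint>
#include <vector>

// Reduce a full-resolution frame (e.g. 640x480) to quarter resolution
// (e.g. 320x240) by averaging each 2x2 block of pixels into one output pixel.
std::vector<std::uint8_t> reduceToQuarter(const std::vector<std::uint8_t>& full,
                                          int width, int height) {
    std::vector<std::uint8_t> reduced((width / 2) * (height / 2));
    for (int y = 0; y < height / 2; ++y) {
        for (int x = 0; x < width / 2; ++x) {
            const int sum = full[(2 * y) * width + 2 * x]
                          + full[(2 * y) * width + 2 * x + 1]
                          + full[(2 * y + 1) * width + 2 * x]
                          + full[(2 * y + 1) * width + 2 * x + 1];
            reduced[y * (width / 2) + x] = static_cast<std::uint8_t>(sum / 4);
        }
    }
    return reduced;
}
```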
- FIGS. 4A through 4D illustrate the various operations necessary to implement the reduced resolution frame storage.
- an exemplary frame storage size of 640x480 pixels (prior to any compression taking place) is used, with a reduced resolution frame storage size of 320x240 pixels (prior to any compression taking place) if no differences indicative of motion or activity occurring in the environment or area are monitored.
- the computer device 320 performs a frame by frame comparison for a particular camera of the cameras 310. It is understood that even with cyclic noise patterns removed, differences between images will still result, even if the actual scene recorded was the same.
- differences between frames that exceed a certain predetermined threshold are used to indicate the introduction of motion to a scene.
- The predetermined threshold can be set, for example, at 3-5% of tolerated loss.
- the predetermined threshold between adjacent frames containing motion will be exceeded because the new object contained in the frame will significantly alter certain bits within the frame.
- The comparison operations preferably operate upon the full-resolution frame size, with the reduced frame size being stored once it is determined that no motion exists between adjacent frames.
- Whether adjacent frames are within the threshold can be determined using pattern recognition techniques, and preferably the pattern recognition technique described in the U.S. Appln. bearing attorney reference number 042503/0259665 mentioned above.
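- As an illustrative sketch only, the threshold comparison described above could be approximated with a simple per-pixel difference test; this is a stand-in for the referenced pattern recognition technique, and the NumPy frame representation, the 8-level noise margin, and the 4% default threshold are all assumptions:

```python
import numpy as np

def motion_detected(reference, target, threshold=0.04):
    """Return True when the fraction of changed pixels between two
    equally sized RGB frames exceeds the threshold (e.g. 3-5%)."""
    if reference.shape != target.shape:
        raise ValueError("frames must have the same resolution")
    # A pixel counts as changed when any colour channel differs by more
    # than a small noise margin (an assumed value of 8 levels).
    diff = np.abs(reference.astype(np.int16) - target.astype(np.int16))
    changed = np.any(diff > 8, axis=-1)
    return changed.mean() > threshold
```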
- the reference frame is initially set to an initial frame of a sequence of frames, while the target frame is initially set to a subsequent frame of the sequence of frames.
- the subsequent frame that was the target frame is redesignated as a new reference frame, and another subsequent frame that follows the subsequent frame is redesignated as a new target frame.
- the process is preferably repeated for each successive frame in the sequence.
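- A minimal sketch of that reference/target redesignation loop, reusing the hypothetical motion_detected() helper above:

```python
def pairwise_motion(frames, threshold=0.04):
    """Compare each adjacent reference/target pair, then redesignate the
    old target as the new reference for the next comparison."""
    reference = frames[0]
    for index, target in enumerate(frames[1:], start=1):
        yield index, motion_detected(reference, target, threshold)
        reference = target
```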
- the recording device 310 is fixed in position, does not zoom, and always records the same background scene. Thus, processing can be simplified from the situation where the recording device 310 is not fixed. If not fixed, then a no-motion reference frame 414 cannot be obtained, and a sequential comparison of frames is required. It is noted, however, that since a sequential comparison of frames may already be obtained if compression in addition to the frame size reduction described herein is being used, that comparison can be used rather than using a no-motion reference frame that is always the same.
- a 640x480 reference frame 402 of digital data that has been previously recorded as a 640x480 size frame that captured a scene A is compared with a subsequent 640x480 target frame 404 of digital data.
- this subsequent frame contains a scene B that is different from scene A, thus indicating that there is activity or motion occurring that engenders differences between the frames 402, 404 and causes the predetermined threshold to be exceeded. Since the predetermined threshold is exceeded, scene B is recorded at the larger 640x480 frame size. Subsequent frames 406 continue to be sized at the larger 640x480 frame size until the predetermined threshold is not exceeded for some window of time, typically 200-300 frames of no activity.
- a 640x480 reference frame 408 of digital data representing scene A, which has previously been recorded as a 320x240 reduced frame, is compared with a 640x480 target frame 410 of digital data, which captures a subsequent scene A that falls within the predetermined threshold. Since subsequent scene A falls within the predetermined threshold, it is also recorded at the reduced 320x240 frame size, indicative of there being no discernible activity or motion. Preferably, subsequent frames 412 continue to be sized at the smaller 320x240 frame size until differences between frames are recognized that cause the predetermined threshold to be exceeded.
- a 640x480 reference frame 414 of digital data that was recorded at 640x480 of scene A is compared with a subsequent 640x480 target frame 416 of digital data, which captures a subsequent scene A that differs by less than the predetermined threshold. Since initial scene A and subsequent scene A are within the threshold, it is concluded that there is no discernible activity or motion. The recorded frame size is thus adjusted to the smaller 320x240 frame size if the window of time referred to above has elapsed. If the window of time has not elapsed, the subsequent scene A is stored as a 640x480 frame, but a counter corresponding to the window of time is incremented. Subsequent frames 418 that are also within the predetermined threshold after the window of time has been exceeded are thus sized at the smaller 320x240 frame size until differences that cause the predetermined threshold to be exceeded are recognized.
- A reference frame 420 of digital data representing scene A is compared with a 640x480 target frame 422 of digital data, which captures a subsequent frame of scene B that differs from scene A by more than the predetermined threshold, indicating activity or motion that engenders differences between the frames 420, 422. Since the predetermined threshold is exceeded, the subsequent frame size is adjusted to the larger 640x480 frame size.
- subsequent frames 406 are sized at the larger 640x480 frame size until the predetermined threshold is no longer exceeded, and the window of time has elapsed.
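- The storage-size switching of FIGS. 4A through 4D could be sketched as the state machine below; the 2x2 averaging downscale, the 250-frame window (an assumed value within the 200-300 frame range mentioned above), and the reuse of the hypothetical motion_detected() helper are assumptions. Comparisons are always made on the full-resolution frames, as described above; only the stored copy is reduced.

```python
NO_MOTION_WINDOW = 250   # assumed value inside the 200-300 frame range

def downscale(frame):
    """Reduce an RGB frame to quarter size by averaging 2x2 pixel blocks."""
    h, w = frame.shape[0] // 2 * 2, frame.shape[1] // 2 * 2
    return frame[:h, :w].reshape(h // 2, 2, w // 2, 2, -1).mean(
        axis=(1, 3)).astype(frame.dtype)

def select_stored_frames(frames, threshold=0.04):
    """Store full 640x480 frames while motion is present; switch to
    320x240 storage once NO_MOTION_WINDOW consecutive still frames have
    been seen.  Comparisons always use the full-resolution frames."""
    stored, still_count = [], 0
    reference = frames[0]
    for target in frames:
        if motion_detected(reference, target, threshold):
            still_count = 0
            stored.append(target)                 # full-size storage
        else:
            still_count += 1
            if still_count >= NO_MOTION_WINDOW:
                stored.append(downscale(target))  # quarter-size storage
            else:
                stored.append(target)
        reference = target
    return stored
```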
- the cyclic noise that is detected can be used for other purposes.
- the cameras, amplifiers, and the like will all be turned on and in continuous use, 24 hours a day. As a result, they tend to operate in a stable manner, and thus the cyclic noise pattern can be eliminated. If, however, the camera, amplifier, or another component begins to drift from its stable operating characteristics, a new cyclic noise pattern will develop that is different from the originally obtained cyclic noise pattern. As a result, the watermark that is occasionally passed will differ, as described above. When this occurs, the difference will cause an alert, as noted above.
- the present invention can be used as an early warning system that can indicate that a particular device may soon completely fail. If a particular device is found to be unstable and needs to be replaced, it is noted that the initial set-up as previously described will need to be performed again, since the new device will cause a different cyclic noise pattern to result.
- One of the pattern recognition embodiments describes a method of detecting an occurrence of an event that exists within a sequence of stored frames relating to a scene, as well as methods that operate upon the data relating to the scene once an event is detected.
- the method of detecting occurrence of an event includes comparing a first stored frame to a later stored frame to determine whether a change in size between the frames exists that is greater than a predetermined threshold.
- the first and later stored frames are compressed, and operated upon while compressed to determine the occurrence of an event.
- Methods of operating upon the data relating to the scene once an event is detected include providing a still image at intervals, typically every 5-10 frames, using the data from at least the recording device that detected the event.
- still images from other recording devices that are associated in a predetermined manner with the recording device that detected the event are also obtained at intervals.
- FIG. 7 illustrates an exemplary system 700 according to the present invention, which is shown as having a computer 720 that compresses and operates upon digitized data using the features of the present invention described herein.
- Computer 720 may also operate to compress the digitized data and transmit it to another device, shown as a server 730, so that server 730 operates upon the digitized data using the features of the present invention described herein. While compression could instead be performed by server 730, in practice this is not efficient.
- a number of computers 720 are shown as providing digitized data to server 730, which aspect is illustrated in order to explain further how various related streams of digitized data can be operated upon according to one embodiment of the present invention, as described hereinafter.
- the computers 720 and 730 could be implemented as a network of computers or as a device other than a computer that contains a processor that is able to implement the inventions described herein. Many other variants are possible.
- While a device such as that mentioned will typically include a processor of some type, such as an Intel Pentium 4 microprocessor or a DSP, in conjunction with program instructions written based upon the teachings herein, other hardware that implements the present invention can also be used, such as a field programmable gate array.
- the program instructions are preferably written in C++ or some other computer programming language.
- the present invention operates upon data preferably formatted into a matrix array within a frame, as described further hereinafter.
- the blocks can be formatted into frames that may or may not have the ability to store the same amount of data.
- each image and its related audio data will preferably have its own frame, although that is not a necessity, since image and audio data can be stored on separate tracks and analyzed independently.
- the computer 720 or server 730 is assumed to have received digital image/audio frames that relate to a sequence, which sequence has been digitized into frames and compressed in some manner. These frames may be compressed in essentially real time and operated upon, or compressed and transmitted for storage in another location. Further, there are situations in which compression is unnecessary, such as when pattern recognition between frames is performed during substantially real-time operations on frames.
- the data obtained corresponds to locations where the scenes being monitored and stored do not change often.
- a camera may be stationed outside a vault or a door in a stairwell and record the scene, which is then received by the computer 720. It is only when someone enters the vault or passes through the door in the stairwell that the image frame changes substantially (since even between sequential frames that record the same scene, changes in the data representing the frame will exist due to at least noise effects).
- the present invention provides a mechanism for detecting that a change has occurred in the field of view without having to decompress image frames and also provides several mechanisms for reacting to the detection of the change.
- FIG. 8 illustrates compressed frames that have been produced by a camera that has had an object pass through its field of view.
- Camera 802 is pointed at doorway 806 in a stairwell and a computer (not shown) attached to camera 802 such as computers 720 of FIG. 7 produces a sequence of digitized and compressed frames 808a-n, 810a-n, 812a-n, and 814a-n. Compression can be achieved using the techniques described in U.S. Patent Application bearing attorney reference 042503/0259665 entitled "Method And Apparatus For Determining Patterns Within Adjacent Blocks of Data" filed on October 31, 2001 and assigned to the assignee of the present application, and Appln. No.
- Whether the frames 808a-n, 810a-n, 812a-n, and 814a-n are operated upon in essentially real time; stored and operated upon; compressed and then operated upon; compressed, stored, and then operated upon; or compressed, transmitted, stored at another location, and then operated upon, the inventions described herein are equally applicable.
- An uncompressed frame will have a relatively constant size, whether or not there is action or movement in the scene, because the same number of bits is used to represent each frame.
- the compressed frames 812a-n are larger than the compressed frames 808a-n and 810a-n because person 804 has entered through doorway 806.
- After the person 804 leaves the field of view, the size of the compressed frames returns to the previous relatively small size, as shown in frames 814a-n.
- FIG. 9A illustrates a process 900 for detecting and reacting to a change in the size of compressed frames obtained from the same source.
- The present invention is not limited to compressed video frames. Even though the frames have been described above in the context of video data, the frames could contain audio data, a mixture of audio and video data, or even another form of data that provides an indication of the occurrence of an event.
- process 900 operates on compressed video frames from the field of view of a camera which has an initial period of substantially no activity in the field of view, followed by a period of activity caused by an object that enters the field of view and causes a change in the size of the compressed video frames.
- Each of the frames is preferably stored in compressed form as a block of data that has a size associated with it.
- In process 900, it is determined at step 920 whether there are more frames to process. If not, process 900 stops at step 922. If there are, the sizes of adjacent compressed frames are compared in step 902. If the subsequent frame is greater (904) than the previous frame by a certain threshold, a react step 905 follows to indicate detection of the occurrence of the event.
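- A minimal sketch of the size-comparison loop of process 900 follows, assuming each compressed frame is held as a byte string; the 1.5x growth factor stands in for the "certain threshold", whose actual value is not specified here:

```python
def detect_events(compressed_frames, growth_factor=1.5):
    """Return the indices at which a compressed frame is markedly larger
    than its predecessor, indicating that something entered the field of
    view (steps 902/904); a caller would react at each index (step 905)."""
    events = []
    for i in range(1, len(compressed_frames)):
        prev_size = len(compressed_frames[i - 1])
        if prev_size and len(compressed_frames[i]) > growth_factor * prev_size:
            events.append(i)
    return events
```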
- FIG. 9B illustrates several possible examples of operations that can be performed in reaction to the detection of an occurrence of an event according to an embodiment of the present invention. Specifically, as shown in FIG. 9B, one or more of three possible operations 906, 910, 916 can be performed as a result of an event occurring, as determined by step 905.
- operation 906 can be instituted, which causes a sequence of still images, obtained at some interval from each other following the initiation of the event, to be captured.
- These still frames represent a significant reduction from the total number of frames that a particular camera has obtained.
- one of the still images obtained will contain a "best view" of the object that has caused the increase in size of the compressed frame.
- operations can be performed, such as transmitting, via email or other transmission mechanism shown in step 908, each still image to a predetermined location.
- image recognition can be performed on each still image, as shown by step 909 indicating that the still frame should be processed for external pattern recognition of an object, such as a person or article, as performed in and described by step 910, detailed hereinafter.
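- Purely as an illustration of operation 906, still images could be pulled at a fixed interval after the detected event; the interval of 8 frames and the count of 10 stills are assumed values within the 5-10 frame spacing mentioned above:

```python
def stills_after_event(frames, event_index, interval=8, count=10):
    """Collect still images at a fixed spacing starting at the frame that
    triggered the event; each still can then be transmitted (step 908) or
    handed to external pattern recognition (steps 909/910)."""
    return frames[event_index:event_index + interval * count:interval]
```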
- In step 910, external pattern recognition is directed to looking for previously obtained external patterns within either compressed or uncompressed data representing frames.
- This pattern recognition is shown as being initiated by step 905 after a compressed frame has been acted upon, which is preferred for previously compressed data, since external pattern recognition need not be performed on frames that have roughly the same size, indicating that no motion is taking place.
- External pattern recognition can also occur on the uncompressed data being searched, to determine if external patterns of significance exist therein, using techniques such as those described in U.S.
- the external pattern is used to obtain search blocks that are searched for in the target frame.
- other conventional pattern recognition techniques can be used.
- The external patterns of interest are contained in a table of preferably both uncompressed and compressed files; which of the files is used depends upon whether pattern recognition is performed on uncompressed data or compressed data, respectively.
- the compressed objects of interest are stored using the same compression technique that is used to obtain compression of the frames, thus allowing for more efficient pattern recognition.
- a match indication will cause an alert of some type to be generated in step 914. This can occur at either computer 720 or server 730.
- An alert can be, for example, an indication on the monitor of a security guard or other authority indicating the identity of the person identified by the external pattern recognition or the location of the event, or it could be an audible alert over a wireless radio to a security guard to confront someone with a certain description at a certain location. An example of an alert will be described in connection with FIG. 10 below.
- process 900 allows a security guard observing the field of view of the camera on a monitor to tag or mark an object/person that caused the change in the size of the frames so as to permit easy following of the person as the person navigates in front of the camera and appears on the monitor, as shown by step 916.
- Different shapes and colors of tags can be used to differentiate between different levels of scrutiny that should be applied to each object. For example, one shaped object, such as a triangle, can be used to designate an external pattern that has a high priority, whereas another shaped object, such as a circle, can be used to designate an external pattern that has a low priority. Similarly, or in combination with the shapes being used, one color, such as red, can be used to designate a different high priority, and another color, such as green, can be used to designate a different low priority.
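- A trivial sketch of how those tag conventions might be tabulated is shown below; the dictionary keys are assumptions, while the triangle/red and circle/green pairings follow the examples above:

```python
# Tag styles keyed by the priority of the matched external pattern.
TAG_STYLES = {
    "high": {"shape": "triangle", "color": "red"},
    "low": {"shape": "circle", "color": "green"},
}

def tag_style(priority):
    """Look up the shape and colour used to mark a tracked object."""
    return TAG_STYLES[priority]
```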
- FIG. 10 illustrates in greater detail a technique for facilitating following an image of an object, such as a person, among a group of objects displayed on a monitor by marking the image of the object on the monitor with a tag or mark.
- FIG. 10 illustrates four screen displays 1002, 1004, 1006, 1008 in which an object caused a change in the size of frames to occur.
- a tag 1001a is attached to the image of person 1001.
- tag 1001a facilitates observance of where the image of person 1001 is on the display.
- Displays 1002, 1004, 1006, and 1008 also show an alerts region in a corner of the display.
- A visual alert is generated when the external pattern recognition process 910 produces an indication that a match has occurred; the alert identifies both the match name, in this case a person's name, and the location where the match occurred, thus describing the identity and location of the object identified.
- FIG. 12 shows a video surveillance system according to an embodiment of the present invention. This system is similar to the one shown in FIG. 11C, with the exception that in addition to controlling the positions of the cameras and supplying the camera signals to the monitor, the controller also manages information in the digital storage device.
- the disk drive stores a table listing all of the data units, e.g., files, stored thereon, the size of each file, its date of creation, its date of last access, and the sector (or other unit as appropriate) at which storage of the data unit begins.
- Each sector of the data unit includes a link to the next sector of the data unit. Possibly, it also includes a link back to the previous sector.
- the final sector of the data unit points to a null value as the next sector.
- If link-backs are included, the first sector's link-back similarly points to a null value.
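- The linked-sector layout described above can be illustrated with a small in-memory structure; this is only an analogy for the on-disk links, not a filesystem implementation:

```python
class Sector:
    """One sector of a stored data unit, chained to its neighbours."""
    def __init__(self, payload):
        self.payload = payload
        self.next = None   # the final sector keeps None (a null link)
        self.prev = None   # the first sector keeps None as its link-back

def chain_sectors(payloads):
    """Build the forward links and optional link-backs between sectors."""
    sectors = [Sector(p) for p in payloads]
    for a, b in zip(sectors, sectors[1:]):
        a.next, b.prev = b, a
    return sectors
```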
- One embodiment of the present invention scores individual data units based on their priority and age, and chooses data units for erasure in the order: low priority, old data; low priority, new data; high priority, new data; high priority, old data.
- Based on the priority and age parameters, the controller can construct a score for each data unit.
- the controller can then, based on the file sizes associated with the images, select enough low-scoring data units for erasure so that there will be enough room for the new data.
- the controller can then instruct the hard disk unit to erase the selected files and store the new data therein.
- While the Age parameter above represents the time since creation of an image file, it could alternatively represent the time since the last access of the image.
- A combination of creation age and access age could also be used. Additionally, other parameters could be incorporated; for example, a score could combine a Subject parameter with the Priority and Age parameters.
- Subject could be 1 for the Vault and 0 for Stairwell, with Priority being 1 for high priority and 0 for low priority, and Age being 0 for old through 255 for new.
- This scoring system would value images from Vault cameras more highly than images from the Stairwell.
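- Since the exact formula is not reproduced above, the following sketch assumes a weighted sum consistent with the example values (Subject and Priority as 0/1 flags, Age in 0-255), for instance Score = Subject*512 + Priority*256 + Age, together with selection of the lowest-scoring files until enough space is freed:

```python
def score(subject, priority, age):
    """Weighted score; higher scores are retained longer.  The 512/256
    weights are assumed, chosen so neither flag is outweighed by Age."""
    return subject * 512 + priority * 256 + age

def select_for_erasure(data_units, bytes_needed):
    """Pick the lowest-scoring data units until enough space is freed.
    Each data unit is a dict with 'size', 'subject', 'priority', 'age'."""
    ranked = sorted(data_units,
                    key=lambda u: score(u["subject"], u["priority"], u["age"]))
    chosen, freed = [], 0
    for unit in ranked:
        if freed >= bytes_needed:
            break
        chosen.append(unit)
        freed += unit["size"]
    return chosen
```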
- the data units subject to potential erasure need not be limited to those already stored but may additionally include the unit intended to be stored. In this case, the new data unit may itself be designated for erasure, in which case no erasure of stored information would be necessary.
- the system may make use of parameters with more than two values.
- the Priority parameter may have values for high, medium and low or a range such as 0-10, with 0 being the highest priority and 10 being the lowest priority.
- FIG. 13 shows a video surveillance system according to an embodiment of the present invention.
- This system is similar to the one shown in FIG. 11C, with the exception that in addition to controlling the positions of the cameras and supplying the camera signals to the monitor, the controller also monitors images produced by the cameras for certain conditions as specified by rules set by the video surveillance system operator, and produces alerts, also called alarms, or the like when one of those conditions is met.
- the monitoring program need not be in the controller, but may be separate and monitor images in the digital storage device after storage.
- the base of the monitoring program lies in its pattern recognition of image features.
- pattern recognition as used herein is capable of identifying people based on a shot of their face in an image, etc.
- the pattern recognition system can also resolve objects, such as purses, briefcases, individual cards, and betting chips. The degree of resolution, of course, depends upon many factors, as is known. All such things that might be the object of pattern recognition will sometimes be referred to as entities in the following discussion and claims.
- Pattern recognition can be based on a single image, e.g., "If the custodian is in the vault shot, notify the system operator", or it can be based on multiple images, e.g., "If John Doe and Joe Smith (two suspected bank robbers) are in the lobby shot at time T1 and only John Doe is in the lobby shot at a later time T2, then notify the system operator and start looking for Joe Smith."
- The monitor program may have a rule that is triggered by a single recognized pattern.
- The rule may also be a disjunctive one that combines several alternative conditions.
- An example of such a rule arises at airport security check-in locations. At an initial entry position, a person A is photographed carrying no objects. At another location, such as an entryway onto an airplane, another photograph shows person A with an object B, which can be used to generate an alarm indicating a changed condition.
- a modification of the rule also provides for the inclusion of alternative or alias information concerning a specific person or object.
- the group information can also include alternative or alias information.
- This pattern recognition process may be done on images in the video surveillance system a single time. Alternatively, it may be done periodically, or on a continuous basis. Further, the rules can have time limits. For example, a rule may specify that if a person A is recognized in an image, the system will search for a person B in images for 15 minutes therefrom and, if person B is found within that time, a certain action will be taken.
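- One way to sketch such a time-limited rule (e.g. the 15-minute person A / person B example) is shown below; the entity names are assumed to come from the pattern recognition stage, and the Rule class and its fields are hypothetical:

```python
import time

class Rule:
    """If 'trigger' is recognised, look for 'follow_up' for 'window_s'
    seconds and call 'action' if it is found within that time."""
    def __init__(self, trigger, follow_up, window_s, action):
        self.trigger, self.follow_up = trigger, follow_up
        self.window_s, self.action = window_s, action
        self._armed_at = None

    def observe(self, entities_in_image, now=None):
        now = time.time() if now is None else now
        if self._armed_at is not None and now - self._armed_at > self.window_s:
            self._armed_at = None                    # rule timed out
        if self.trigger in entities_in_image:
            self._armed_at = now                     # start the clock
        if self._armed_at is not None and self.follow_up in entities_in_image:
            self.action()
            self._armed_at = None

# Example: notify the operator if person B appears within 15 minutes of person A.
rule = Rule("person A", "person B", 15 * 60, lambda: print("notify operator"))
```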
- FIG. 14 illustrates a data gathering system according to one embodiment of the present invention.
- System 1400 includes player place settings 1410a-g, dealer setting 1412, camera 1414, computer 1415, network 1416 and terminal 1417.
- Camera 1414 films table 1400 and player place settings 1410a-g and dealer setting 1412 to obtain a stream of digital data that includes the repetitive actions that occur.
- the repetitive actions are activities that occur in the place settings 1410a-g and dealer setting 1412.
- the camera 1414 is preferably fixed, and is preferably set at the same zoom position for all comparison operations performed as described herein, so that as much consistency as possible between adjacent frames in the stream of digital data is obtained.
- a player place setting 1410a-g has bet area 1502 and play area 1504.
- a player will place bets such as chips or jetons in bet area 1502 and cards of the player's hand in play area 1504.
- activity takes place in play area 1504 and possibly in bet area 1502.
- the dealer's hand will be placed in dealer's hand area 1506.
- Figure 16 illustrates a sequence of repetitive actions that are possible in a game played in accordance with an embodiment of the present invention.
- Each of the player place settings 1410a-g is clear of any cards and bets, and the dealer's setting 1412 is clear of any cards.
- the sequence of repetitive actions 1602a-312a is representative of what happens at one of the player place settings 1410a-g. A sequence similar to that shown in Figure 16 can occur for the other player place settings.
- FIG. 15A illustrates a mask for a gaming table according to one embodiment of the present invention.
- Mask 1500 includes masks for player place settings 1510a-g and mask for dealer place setting 1512.
- Computer 1415 stores mask 1500 and uses it to detect transitions between hands.
- FIG. 17 illustrates the masks for player place settings and the dealer place setting in greater detail.
- Mask 1702 is for player place setting 1510a-g in which no cards and bets are present, and a mask 1704 is for a dealer setting 1512 in which no cards are present.
- the above-described pattern comparisons require pattern matching operations to be performed between the mask 1702 and that portion of the digital data stream corresponding to the location of the mask 1702 during the playing of the game of chance.
- the mask 1702 in such comparison operations, is essentially an external pattern that is being searched for in a particular location of each frame of the stream of digital data representing the image.
- Conventional pattern recognition systems can be used to operate upon the stream of digital data and obtain the indications of the mask 1702 being within the stream of digital data that is obtained.
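- As a simple stand-in for the conventional pattern recognition mentioned above, the mask comparison could be approximated by differencing the masked region of each frame against the stored mask image; the region coordinates, the 16-level noise margin, and the 5% mismatch tolerance are assumptions:

```python
import numpy as np

def mask_matches(frame, mask, top_left, tolerance=0.05):
    """Return True if the region of 'frame' at 'top_left' looks like the
    stored mask image (e.g. an empty player place setting)."""
    y, x = top_left
    h, w = mask.shape[0], mask.shape[1]
    region = frame[y:y + h, x:x + w]
    if region.shape != mask.shape:
        return False
    diff = np.abs(region.astype(np.int16) - mask.astype(np.int16))
    mismatched = np.any(diff > 16, axis=-1)   # per-pixel mismatch flag
    return mismatched.mean() < tolerance
```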
- the mask area can be further required to at least have recognized within it an object of significance to it, such as a card or a chip, in order to prevent an errant object, such as a hand, that appears in the mask area from incorrectly indicating that a game is underway or has been completed.
- Comparisons between frames can also be made, such that continued durations of an activity can generate a count. For instance, white space on a dealer card area that exists for greater than a predetermined period of time could be used to generate a count, with another count not being generated until after that dealer card area has had cards placed thereon for another predetermined period of time.
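- A sketch of that duration-based counting, assuming one boolean per frame (for example the hypothetical mask_matches() result above, True while the dealer card area shows only white space) and an assumed minimum duration of 30 frames:

```python
def count_hands(area_empty_flags, min_duration=30):
    """Register a count when the masked area stays empty for at least
    'min_duration' consecutive frames, then wait until cards have covered
    the area for the same duration before allowing the next count."""
    count, run, waiting_for_cards = 0, 0, False
    for empty in area_empty_flags:
        if empty == waiting_for_cards:   # wrong state for the current run
            run = 0
            continue
        run += 1
        if run >= min_duration:
            if not waiting_for_cards:
                count += 1               # the area stayed empty long enough
            waiting_for_cards = not waiting_for_cards
            run = 0
    return count
```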
- FIG. 18A illustrates a roulette layout.
- Layout 1800 is divided into 180 areas for placing bets.
- the fundamental area of layout 1800 is the alternating area of red and black numbers 1-36 and digits 0 and 00 that are colored green.
- the remaining areas are permutations of the fundamental area: areas for even numbers, odd numbers, red numbers, black numbers, first 12 numbers, second 12 numbers, third 12 numbers, first 18 numbers, and last 18 numbers.
- Each of the one to six players at the roulette table is given different-colored chips so that the numbers on the layout that a player is betting on can be tracked by reference to the chip color.
- FIG. 18B illustrates a roulette wheel. Wheel 1810 is divided into 38 slots 1812 for a ball to land in, numbered 1 through 36, plus 0 and 00. Each roulette game begins when the dealer spins the wheel in one direction and then rolls a small ball along the inner edge 1814 of wheel 1810 in the opposite direction. The ball eventually falls into one of the numbered slots 1812. That number is the declared winner for that game.
- FIG. 18C illustrates a mask 1820 for a roulette wheel, which can be as simple as tracking the slot area 1812 that the ball rolls into. Mask 1820 is stored in a computer such as computer 1415 of FIG. 14 and is used to detect the transitions between roulette games.
- a camera such as camera 1414 is placed to view the wheel 1810 and is used to capture the repetitive actions of the roulette wheel and ball.
- each time the ball rolls into a slot this indicates that the game is complete, and can be recorded as a repetitive sequence.
- That camera, or another camera can also be used to capture the repetitive action of chips being played on the table, with each of the separate betting areas having its own mask area, which can be queried for repetitive activity using the techniques described above.
- the actions of chips being taken away from losing bets by the dealer, and other chips being provided to the winner from the dealer are repetitive activities that can be used to count the number of games that take place in a given period of time.
- FIG. 19 illustrates a sequence of repetitive actions for a roulette wheel and ball.
- By comparing, at a computer such as computer 1415, mask 1820 to the repetitive actions 1902-1908, it can be determined that two games have been played. This is known from the sequence of four frames (with other frames in between not shown), since the ball coming to rest in any slot 1812 can be used as an indication that a game has been completed, which action is shown by actions 1902 and 1908.
- Each time the ball appears along the inner edge 1814 of wheel 1810, this can be used to indicate that a new game is occurring.
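- The game counting for the roulette wheel can be sketched in the same way, assuming a per-frame boolean that is True while the ball is seen at rest in a numbered slot 1812 (the 15-frame settling period is an assumption):

```python
def count_roulette_games(ball_in_slot_flags, settle_frames=15):
    """Count a game as complete once the ball has stayed in a numbered
    slot for 'settle_frames' consecutive frames; do not count again until
    the ball has left the slot area."""
    games, run, counted = 0, 0, False
    for in_slot in ball_in_slot_flags:
        if in_slot:
            run += 1
            if run >= settle_frames and not counted:
                games += 1
                counted = True
        else:
            run, counted = 0, False
    return games
```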
- the efficiency of the roulette dealer can be tracked.
- By tracking both the mask 1820 and the mask associated with each separate betting area, it can be determined that the declared winner at the table corresponds to the actual winner as determined by which numbered slot 1812 the ball actually fell into.
- the present invention can be adapted for other repetitive games, such as poker, 3-card poker, pai-gow, Caribbean stud, baccarat, and other games.
- reports can be generated based upon the statistics obtained by the present invention.
- the number of hands dealt in the period can be obtained.
- that dealer's average efficiency can be determined.
- Statistics can also be kept on a table-location basis, for example, so that it can be determined which tables are busiest during various periods of time, which can then allow, again for example, staffing of the busiest tables with the most efficient dealers.
- FIGS. 20A and 20B illustrate two different reports, directed to a dealer and a table location, respectively, illustrating the statistics obtained over a single shift of a predetermined duration. Added security also is obtained, since verification that payouts were made to actual winners can occur.
- cameras in hallways can be used to keep track of the period of time that a laundry cart is in front of a specific room, using a mask that contains a picture of the room without a cart in front.
- the object can be interpreted to be the cart.
- the period of time until that object is removed from the scene can be used to monitor the amount of time the cart was in front of the room, and therefore obtain an estimate of the time that was needed to clean the room.
- Additionally, the repetitive action of a dealer making money payouts can be used to count the amount of money paid out. Since a camera is typically mounted above a table, a perspective view of the rack that contains the chips used for payouts cannot be obtained. It is typical, however, to place a silver coin between every five chips. Each time a silver coin is seen in an area corresponding to a particular column of chips being paid out, it can be estimated that five chips times the value of those chips has been paid out. Counting the instances of recognizing that silver coin in the area corresponding to that column of chips therefore allows a total estimate of the amount paid out to be obtained; the repetitive action in this case is looking for the instances in which silver appears in a mask area corresponding to that column of chips. Of course, other repetitive activities can also be monitored automatically using the techniques described herein.
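- The payout estimate described above reduces to simple arithmetic; the function below assumes a count of silver-coin sightings in the payout column has already been obtained by the mask technique:

```python
def estimate_payout(silver_coin_sightings, chip_value):
    """Each recognised silver coin marks a stack of five chips, so the
    estimated payout is sightings x 5 x chip value."""
    return silver_coin_sightings * 5 * chip_value

# Example: 12 sightings of $25 chips -> an estimated $1,500 paid out.
```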
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Closed-Circuit Television Systems (AREA)
- Alarm Systems (AREA)
- Image Analysis (AREA)
Abstract
A distributed surveillance system provides for digital storage of data, as well as recognition of external patterns obtained in parallel operations. The digital data can be stored at different levels of compression, and the pattern recognition can be performed while the digital data is still in its compressed form. The size of a digital frame is reduced when no motion is observed across sequential frames, thereby minimizing storage. Also described is a method of detecting the occurrence of an event, which comprises comparing a first compressed frame size to a later compressed frame size in order to determine that an event has occurred. A method of monitoring a game of chance comprises operating a video camera to obtain a data stream having stored therein a plurality of repetitive actions relating to the game of chance, and automatically analyzing the data stream to count the number of repetitive actions, the resulting count being an indicator usable for monitoring the game of chance.
Applications Claiming Priority (13)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US991487 | 1997-12-16 | ||
US09/991,531 US20030096643A1 (en) | 2001-11-21 | 2001-11-21 | Data gathering for games of chance |
US09/991,487 US7058771B2 (en) | 2001-11-21 | 2001-11-21 | System and method for managing memory in a surveillance system |
US990868 | 2001-11-21 | ||
US991528 | 2001-11-21 | ||
US09/991,528 US20030095180A1 (en) | 2001-11-21 | 2001-11-21 | Method and system for size adaptation and storage minimization source noise correction, and source watermarking of digital data frames |
US09/990,868 US7006666B2 (en) | 2001-11-21 | 2001-11-21 | Method and apparatus for detecting and reacting to occurrence of an event |
US09/991,490 US20030095687A1 (en) | 2001-11-21 | 2001-11-21 | System and method for generating alert conditions in a surveillance system |
US991490 | 2001-11-21 | ||
US991531 | 2001-11-21 | ||
US991527 | 2002-02-22 | ||
US09/991,527 US6978047B2 (en) | 2000-11-29 | 2002-02-22 | Method and apparatus for storing digital video content provided from a plurality of cameras |
PCT/US2002/035144 WO2003047258A1 (fr) | 2001-11-21 | 2002-10-31 | Procede et appareil de stockage de contenu video capte par plusieurs cameras |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1457050A1 true EP1457050A1 (fr) | 2004-09-15 |
Family
ID=27560356
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP02803971A Withdrawn EP1457050A1 (fr) | 2001-11-21 | 2002-10-31 | Procede et appareil de stockage de contenu video capte par plusieurs cameras |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP1457050A1 (fr) |
JP (1) | JP2005534205A (fr) |
AU (1) | AU2002365345A1 (fr) |
WO (1) | WO2003047258A1 (fr) |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9892606B2 (en) | 2001-11-15 | 2018-02-13 | Avigilon Fortress Corporation | Video surveillance system employing video primitives |
US7868912B2 (en) * | 2000-10-24 | 2011-01-11 | Objectvideo, Inc. | Video surveillance system employing video primitives |
US8711217B2 (en) | 2000-10-24 | 2014-04-29 | Objectvideo, Inc. | Video surveillance system employing video primitives |
US8564661B2 (en) | 2000-10-24 | 2013-10-22 | Objectvideo, Inc. | Video analytic rule detection system and method |
JP2005122359A (ja) * | 2003-10-15 | 2005-05-12 | Sanyo Electric Co Ltd | コンテンツ処理装置 |
US7672370B1 (en) * | 2004-03-16 | 2010-03-02 | 3Vr Security, Inc. | Deep frame analysis of multiple video streams in a pipeline architecture |
US7663661B2 (en) * | 2004-03-16 | 2010-02-16 | 3Vr Security, Inc. | Feed-customized processing of multiple video streams in a pipeline architecture |
US7664183B2 (en) * | 2004-03-16 | 2010-02-16 | 3Vr Security, Inc. | Correlation processing among multiple analyzers of video streams at stages of a pipeline architecture |
US7697026B2 (en) * | 2004-03-16 | 2010-04-13 | 3Vr Security, Inc. | Pipeline architecture for analyzing multiple video streams |
US7646895B2 (en) | 2005-04-05 | 2010-01-12 | 3Vr Security, Inc. | Grouping items in video stream images into events |
US8130285B2 (en) | 2005-04-05 | 2012-03-06 | 3Vr Security, Inc. | Automated searching for probable matches in a video surveillance system |
DE102005019153A1 (de) * | 2005-04-25 | 2007-06-06 | Robert Bosch Gmbh | Verfahren und System zum Verarbeiten von Daten |
GB2457194A (en) * | 2006-11-08 | 2009-08-12 | Cryptometrics Inc | System and method for parallel image processing |
ITAV20070003U1 (it) * | 2007-09-14 | 2007-12-14 | Silvio Spiniello | Modello di utilita' dal nome "secutity" per la prevenzione,la repressione,la sicurezza,la ricerca di persone scomparse e la ricostruzione storica dei fatti realmente accaduti.il tutto con l'utilizzo della mia invenzione che ci permettera' di poter ce |
US8115623B1 (en) | 2011-03-28 | 2012-02-14 | Robert M Green | Method and system for hand basket theft detection |
US8094026B1 (en) | 2011-05-02 | 2012-01-10 | Robert M Green | Organized retail crime detection security system and method |
JPWO2014091667A1 (ja) | 2012-12-10 | 2017-01-05 | 日本電気株式会社 | 解析制御システム |
WO2014097530A1 (fr) * | 2012-12-19 | 2014-06-26 | 日本電気株式会社 | Système de commande d'un processus d'analyse |
JP5500303B1 (ja) | 2013-10-08 | 2014-05-21 | オムロン株式会社 | 監視システム、監視方法、監視プログラム、ならびに該プログラムを記録した記録媒体 |
CN106791586A (zh) * | 2015-11-19 | 2017-05-31 | 杭州海康威视数字技术股份有限公司 | 一种对移动目标进行监控的方法及监控设备、装置、系统 |
CN111836102B (zh) * | 2019-04-23 | 2023-03-24 | 杭州海康威视数字技术股份有限公司 | 视频帧的分析方法和装置 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2596601A1 (fr) * | 1986-03-31 | 1987-10-02 | Nippon Denki Home Electronics | Appareil cyclique de reduction du bruit |
US5901246A (en) * | 1995-06-06 | 1999-05-04 | Hoffberg; Steven M. | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
US5724475A (en) * | 1995-05-18 | 1998-03-03 | Kirsten; Jeff P. | Compressed digital video reload and playback system |
AU7497700A (en) * | 1999-09-16 | 2001-04-17 | Matthew D. Nicholson | Casino table automatic video strategic event annotation and retrieval |
- 2002
- 2002-10-31 JP JP2003548542A patent/JP2005534205A/ja active Pending
- 2002-10-31 AU AU2002365345A patent/AU2002365345A1/en not_active Abandoned
- 2002-10-31 EP EP02803971A patent/EP1457050A1/fr not_active Withdrawn
- 2002-10-31 WO PCT/US2002/035144 patent/WO2003047258A1/fr active Application Filing
Non-Patent Citations (1)
Title |
---|
See references of WO03047258A1 * |
Also Published As
Publication number | Publication date |
---|---|
AU2002365345A1 (en) | 2003-06-10 |
WO2003047258A1 (fr) | 2003-06-05 |
JP2005534205A (ja) | 2005-11-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2003047258A1 (fr) | Procede et appareil de stockage de contenu video capte par plusieurs cameras | |
US6978047B2 (en) | Method and apparatus for storing digital video content provided from a plurality of cameras | |
US7006666B2 (en) | Method and apparatus for detecting and reacting to occurrence of an event | |
AU769980B2 (en) | Casino video security system | |
JP3974038B2 (ja) | 監視・偵察システムにおける軌道分析を用いた侵入者検知 | |
US10535375B2 (en) | Information processing system, information processing method, and recording medium | |
US20030044168A1 (en) | Event image recording system and event image recording method | |
ES2320416T3 (es) | Procedimiento y aparato para reducir falsas alarmas en situaciones de salida/entrada para la vigilancia de seguridad residencial. | |
WO2001006790A1 (fr) | Systeme d'enregistrement video numerique | |
US20110128382A1 (en) | System and methods for gaming data analysis | |
EP1901822A2 (fr) | Systeme de jeu a distance avec jeux sur table en direct | |
JP6248303B2 (ja) | 遊技場システム | |
US20080151049A1 (en) | Gaming surveillance system and method of extracting metadata from multiple synchronized cameras | |
JP2003340124A (ja) | 遊技場における監視装置 | |
JP2002279540A (ja) | 画像監視システム及びネットワークを用いた画像監視システム | |
JP2006006590A (ja) | セキュリティシステム | |
US20030095687A1 (en) | System and method for generating alert conditions in a surveillance system | |
JP4156690B2 (ja) | 遊技場における監視装置 | |
US7058771B2 (en) | System and method for managing memory in a surveillance system | |
JP5050243B2 (ja) | 分散型監視カメラシステム | |
US20020005899A1 (en) | Identification transaction recording system | |
JP3022385B2 (ja) | 防犯機能付き台間玉貸し機 | |
JP2007190076A (ja) | 監視支援システム | |
US20060098880A1 (en) | Method and apparatus for storing digital video content provided from a plurality of cameras | |
JP2006255027A (ja) | 監視システムおよび方法、不正遊技者認識サーバおよび方法、人追跡サーバおよび方法、監視サーバおよび方法、並びにプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20040614 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LI LU MC NL PT SE SK TR |
|
AX | Request for extension of the european patent |
Extension state: AL LT LV MK RO SI |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20080501 |