US7346186B2 - Video and audio content analysis system - Google Patents

Video and audio content analysis system

Info

Publication number
US7346186B2
Authority
US
United States
Prior art keywords
application
video
processing unit
audio
control unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US10/056,049
Other versions
US20020110264A1 (en)
Inventor
David Sharoni
Hagai Katz
Yehuda Katzman
Doron Girmonski
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qognify Ltd
Original Assignee
NICE Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US26472501P
Application filed by NICE Ltd
Priority to US10/056,049
Assigned to NICE SYSTEMS, LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GIRMONSKI, DORON; KATZ, HAGAI; KATZMAN, YEHUDA; SHARONI, DAVID
Publication of US20020110264A1
Application granted
Publication of US7346186B2
Assigned to QOGNIFY LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NICE SYSTEMS LTD.
Assigned to MONROE CAPITAL MANAGEMENT ADVISORS, LLC SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ON-NET SURVEILLANCE SYSTEMS INC., QOGNIFY LTD.
Application status: Active

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00: Burglar, theft or intruder alarms
    • G08B13/18: Actuation by interference with heat, light or radiation of shorter wavelength; actuation by intruding sources of heat, light or radiation of shorter wavelength
    • G08B13/189: Actuation using passive radiation detection systems
    • G08B13/194: Actuation using image scanning and comparing systems
    • G08B13/196: Actuation using television cameras
    • G08B13/19639: Details of the system layout
    • G08B13/19645: Multiple cameras, each having view on one of a plurality of scenes, e.g. multiple cameras for multi-room surveillance or for tracking an object by view hand-over
    • G08B13/19697: Arrangements wherein non-video detectors generate an alarm themselves

Abstract

The present invention is directed to various methods and systems for analysis and processing of video and audio signals from a plurality of sources in real-time or off-line. According to some embodiments of the present invention, analysis and processing applications are dynamically installed in the processing units.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims priority from U.S. provisional Application Ser. No. 60/264,725, filed Jan. 30, 2001, which is incorporated herein by reference in its entirety.

BACKGROUND OF THE INVENTION

The ever-increasing use of video and audio in the military, law enforcement and surveillance fields has resulted in the need for an integrative system that may combine several known detecting and monitoring systems. There are several questions related to real-time and off-line analysis and processing of information regarding the existence and behavior of people and objects in a certain monitored area.

Examples of such typical questions include questions regarding the presence and identification of people (e.g. Is there anybody? If so, who is he?), movement (e.g. Is there anything moving?), the number of people (e.g. How many people are there?), duration of time (e.g. For how long have they stayed in the area?), identification of sounds, content of speech, the number of articles and the like.

Currently, a dedicated system having a separate infrastructure is usually installed to provide a limited solution to each of the above-mentioned questions. Non-limiting examples of these systems include a video and audio recording system such as NiceVision of Nice Systems Ltd., Ra'anana, Israel, a movement-detecting system such as Vicon8i of Vicon Motion Systems, Lake Forest, Calif., USA and a face-recognition system such as FaceIt system of Visionics Corp., Jersey City, N.J., USA.

The separate infrastructure for each application also limits the area of surveillance. For example, a face recognition system, which is connected to a single dedicated video sensor, can cover only a narrow area. Moreover, the separated applications provide only a limited and partial integration between various monitoring applications.

An integrated monitoring system may enable advanced solutions for combined and conditioned questions. An example of conditioned questions is described below. “If there is a movement, is anyone present? If someone is present, can he be identified? If he can be identified, what is he saying? If he cannot be identified, record the event.”
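The conditioned-question chain above can be sketched as a simple decision function. This is an illustrative sketch only; the function and action names (`handle_event`, "record the event") are assumptions and stand-ins for the real detection and recognition results of the monitoring system.

```python
from typing import Optional


def handle_event(motion: bool, person_present: bool, identity: Optional[str]) -> str:
    """Walk the chain: movement -> presence -> identification -> action."""
    if not motion or not person_present:
        # No movement, or movement with nobody present: nothing to do.
        return "no action"
    if identity is not None:
        # The person was identified: move on to analyzing what he is saying.
        return f"analyze speech of {identity}"
    # Present but unidentified: record the event.
    return "record the event"
```

Each stage only runs when the previous condition holds, which is what distinguishes the integrated system from a set of independent, always-on detectors.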

It would be advantageous to have an integrated monitoring system for the analysis and processing of video and audio signals from a plurality of sources, in real-time and off-line.

SUMMARY OF THE INVENTION

The present invention is directed to various methods and systems for analysis and processing of video and audio signals from a plurality of sources in real-time or off-line. According to some embodiments of the present invention, analysis and processing applications are dynamically installed in the processing units.

There is thus provided in accordance with some embodiments of the present invention, a system having one or more processing units, each coupled to a video or an audio sensor to receive video or audio data from the sensor, an application bank comprising content-analysis applications, and a control unit to instruct the application bank to install at least one of the applications into at least one of the processing units.

There is further provided in accordance with some embodiments of the present invention, a method comprising installing one or more content-analysis applications from an application bank into one or more video or audio processing units, the applications selected according to predetermined criteria, and processing input received from one or more video or audio sensors, each coupled to a respective one of the video or audio processing units, according to at least one of the installed applications.

BRIEF DESCRIPTION OF THE DRAWINGS

The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention, however, both as to organization and method of operation, together with objects, features and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanying drawings in which:

FIG. 1 is a block diagram illustration of a video and audio content analysis system according to some embodiments of the present invention;

FIG. 2 is a block diagram illustration of a distributed video and audio content analysis system according to some embodiments of the present invention;

FIG. 3 is a flow chart diagram of the operation of the system of FIGS. 1 and 2 according to some embodiments of the present invention; and

FIGS. 4A and 4B are block diagram illustrations of the video-processing unit of FIG. 1 and FIG. 2 according to some embodiments of the present invention;

It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements.

DETAILED DESCRIPTION OF THE PRESENT INVENTION

In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components and circuits have not been described in detail so as not to obscure the present invention.

Reference is now made to FIG. 1, which is a block diagram illustration of a video and audio content analysis system 10 according to some embodiments of the present invention. System 10 may be coupled to a surveillance system having a video and audio logging and retrieval unit such as NiceVision of Nice Systems Ltd, Ra'anana, Israel.

System 10 may comprise a plurality of video sensors 12 and a plurality of audio sensors 14. Video sensor 12 may output an analog video signal or a digital video signal. The digital signals may be in the form of data packets with Internet Protocol (IP) as their upper layer and may be transmitted over digital subscriber line (DSL), asymmetric DSL (ADSL), asynchronous transfer mode (ATM) or frame relay (FR).

Audio sensor 14 may output an analog audio signal or a digital audio signal. The digital signals may be in the form of data packets over a network, for example, an IP network, an ATM network or an FR network.

System 10 may further comprise a plurality of video-processing units 16 able to receive signals from video sensors 12 and a plurality of audio-processing units 18 able to receive signals from audio sensors 14. Video-processing units 16 may be coupled to video sensors 12 and may be located in the proximity of sensors 12 or may be located remote from sensors 12. Alternatively, video-processing units 16 may be embedded in video sensors 12. Audio-processing units 18 may be coupled to audio sensors 14 and may be located in the proximity of sensors 14 or may be located remote from sensors 14. Alternatively, audio-processing units 18 may be embedded in audio sensors 14. Video-processing unit 16 and audio-processing unit 18 may be a single integral unit.

Other types of sensors and their associated processing units may be added to system 10. Non-limiting examples of additional sensors are smoke sensors, fire sensors, motion detectors, sound detectors, presence sensors, movement sensors, volume sensors, and glass breakage sensors.

System 10 may further comprise an application bank 24 coupled to processing units 16 and 18. Application bank 24 may comprise a plurality of various content analysis applications based on video and/or audio signals processing. For example, application 25 may be a video motion-detecting application, application 26 may be a video based people-counting application, application 28 may be a face-recognition application, and application 29 may be a voice-recognition application. Additional applications may be added to application bank 24. Non-limiting examples of additional applications include conversion of speech to text, compressing the video and/or audio signal and the like.

System 10 may further comprise a database 30 and a storage media 32. Storage media 32 may receive data from processing units 16 and 18 and store the video and audio input. Non-limiting examples of storage media 32 include a computer's memory, a hard disk, a digital audio tape, a digital video disk (DVD), an advanced intelligent tape (AIT), digital linear tape (DLT), linear tape-open (LTO), JBOD, RAID, NAS, SAN and iSCSI. Database 30 may store the time, date and other annotations relating to specific segments of recorded audio and video input, for example, the input channel associated with the sensor from which the input was received and the location of the stored input in storage media 32. The type of trigger for recording, manual or scheduled, may likewise be stored in database 30. Alternatively, the segments of recorded audio and video, preferably compressed, may also be stored in database 30.

System 10 may further comprise a control unit 20 able to control any of elements 16, 18 and 24. At least one set of internal rules may be installed in control unit 20. Non-limiting examples of a set of rules include a set of installation rules, a set of recording rules, a set of alert rules, a set of post-alert action rules, and a set of authorization rules.

The set of installation rules may determine the criteria for installing applications in the processing units. The set of recording rules may determine the criteria for recording audio and video data. The set of alert rules may determine the criteria for sending alert notifications from the processing units to the control unit. The set of post-alert action rules may determine the criteria for activating or deactivating applications installed in a processing unit and the criteria for re-installing applications in the processing units.

Control unit 20 may command application bank 24 to install various applications in processing units 16 and 18, as required by the internal rules installed in control unit 20. The installation may vary among processing units. For example, in one video-processing unit 16, application bank 24 may install motion-detection application 25 and people-counting application 26. In another video-processing unit 16, application bank 24 may install motion-detection application 25 and face-recognition application 28.
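The per-unit installation just described can be sketched as follows. This is a minimal illustrative model, not the patent's implementation; the class names, the rule schema (a mapping from unit identifier to application names) and the example unit identifiers are all assumptions.

```python
class ProcessingUnit:
    """A video- or audio-processing unit that hosts installed applications."""
    def __init__(self, unit_id):
        self.unit_id = unit_id
        self.installed = []


class ApplicationBank:
    """Holds content-analysis applications, keyed by name."""
    def __init__(self, applications):
        self.applications = applications

    def install(self, app_name, unit):
        # Installation is modeled as adding the application name to the unit.
        unit.installed.append(app_name)


class ControlUnit:
    """Commands the bank to install applications per the installation rules."""
    def __init__(self, bank, installation_rules):
        self.bank = bank
        self.rules = installation_rules  # unit_id -> list of application names

    def apply_installation_rules(self, units):
        for unit in units:
            for app_name in self.rules.get(unit.unit_id, []):
                self.bank.install(app_name, unit)


bank = ApplicationBank({"motion-detection": object(),
                        "people-counting": object(),
                        "face-recognition": object()})
control = ControlUnit(bank, {
    "16a": ["motion-detection", "people-counting"],
    "16b": ["motion-detection", "face-recognition"],
})
units = [ProcessingUnit("16a"), ProcessingUnit("16b")]
control.apply_installation_rules(units)
```

Note that the control unit never touches the units directly here; it works through the application bank, mirroring the division of roles described in the text.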

The installation may be altered from time to time according to instructions from a time-based scheduler (not shown) installed in control unit 20 or manually triggered by an operator as will be explained below.

System 10 may further comprise at least one client computer 40 having a display and at least one speaker (not shown) and at least one printer 42. Client computer 40 and printer 42 may be coupled to database 30, storage 32, control unit 20, and application bank 24, either by direct connection or via a network 44. Network 44 may be a local area network (LAN) or a wide area network (WAN).

The operators of system 10 may control it via client computers 40. Client computer 40 may request playing a real-time stream of video and/or audio data. Alternatively, client 40 may request playback of video and audio data stored at database 30 and/or storage 32. The playback may comprise synchronized or unsynchronized recorded data of multiple audio and/or video channels. The video may be played on the client's display and the audio may be played via the client's speakers.

Client 40 may also edit the received data and may execute off-line investigations. The term “off-line investigation” refers to the following mode of operation. Client 40 may request playback of certain video and/or audio data stored in storage media 32. Client 40 may also command application bank 24 to download at least one of the applications to client 40. After receiving the application and the video and/or audio files, client 40 may execute the application off-line. The off-line investigation may be executed even when the specific application was not installed or enabled on the processing unit 16 or 18 coupled to the sensor 12 or 14 from which the video or audio data were recorded.
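A minimal sketch of this off-line mode, under the assumption that recorded data can be modeled as stored frames and a downloadable application as a callable. The names (`investigate`, the "lobby" channel, `people-counting`) are illustrative only.

```python
# Recorded "frames" per channel; here each frame is the list of people seen.
storage = {"lobby": [["alice", "bob"], ["alice"], []]}

# Downloadable applications in the bank, keyed by name. `len` stands in for
# a real people-counting application.
bank = {"people-counting": len}


def investigate(app_name, channel):
    app = bank[app_name]        # download the application to the client
    frames = storage[channel]   # request playback of recorded data
    return [app(frame) for frame in frames]  # execute the application off-line
```

The point of the sketch is that the application runs on the client against recorded data, independently of whatever was installed on the processing unit at recording time.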

Each operator may have personal authorization to perform certain operations according to a predefined set of authorization rules installed in control unit 20. Some operators may have authorization to alter via client 40 at least certain of the internal rules installed in control unit 20. Such alteration may include immediate activation or de-activation of an application in one of processing units 18 and 16.

Client 40 may also send queries to database 30. An example of a query may be: “Which video sensors detected movement between 8:00 AM and 11:00 AM?” Client 40 may also request sending reports to printer 42.
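The example query can be sketched as a filter over the indexing records in database 30. The record layout below is an assumption based on the indexing data described elsewhere in the text, not a documented schema.

```python
# Hypothetical indexing records as they might be stored in database 30.
records = [
    {"sensor": "V3",  "event": "movement",   "hour": 9},
    {"sensor": "V7",  "event": "face-match", "hour": 10},
    {"sensor": "V12", "event": "movement",   "hour": 14},
]


def sensors_with_movement(db, start_hour, end_hour):
    """Which video sensors detected movement in the given time window?"""
    return sorted({r["sensor"] for r in db
                   if r["event"] == "movement"
                   and start_hour <= r["hour"] <= end_hour})
```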

Reference is now made to FIG. 2, which is a block diagram illustration of a video and audio content analysis system 11 according to some embodiments of the present invention. System 11 is a distributed version of system 10 of FIG. 1, and elements in common have the same numeral references. In these embodiments, video sensors 12, which may be coupled to video-processing units 16, and audio sensors 14, which may be coupled to audio-processing units 18, may be located at two or more remote and separate sites.

Processing units 16 and 18 may be coupled to all the other elements (e.g. database 30, storage 32, control unit 20 and application bank 24 as well as clients 40) of system 11 via network 44. Application bank 24, control unit 20, database 30 and storage 32 may be coupled to each other via network 44, which may include several networks. However, it should be understood that the scope of the present invention is not limited to such a system and system 10 may be only partially distributed.

Reference is now made to FIG. 3, which is a simplified flowchart illustration of the operation of the video and audio content analysis system of FIGS. 1 and 2, according to some embodiments of the present invention. In the method of FIG. 3, control unit 20 may command application bank 24 to install various applications in processing units 16 and 18 (step 100). Different applications may be installed in different units. Processing units 16 and 18 may then receive video and audio signals from video and audio sensors 12 and 14, respectively (step 102). If the signals are analog signals, processing units 16 and 18 may convert the analog signals to digital signals.

Processing units 16 and 18, then, may execute the applications installed in each unit (step 104). The audio and video signals may be compressed and stored in storage media 32 according to a predefined set of recording rules installed in control unit 20 (step 106).

Processing units 16 and 18 may also output indexing-data to be stored in database 30 (step 108). Non-limiting examples of indexing data may include the time of recording, time occurrence of matching a voice or face and the time of counting. Other non-limiting examples may include a video channel number, an audio channel number, results of a people-counting application (e.g. number of people), an identifier of the recognized voice or the recognized face and direction of movement detected by a motion detection application.
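One indexing-data record of step 108 might look like the following sketch. The field names are assumptions drawn from the examples in the text (channel, timestamps, people counts, recognized identities, movement direction), not a schema defined by the patent.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class IndexRecord:
    channel: str                        # video or audio channel number
    timestamp: str                      # time of recording or of the event
    application: str                    # which application produced the result
    people_count: Optional[int] = None  # result of a people-counting application
    match_id: Optional[str] = None      # recognized face or voice identifier
    direction: Optional[str] = None     # direction detected by motion detection


rec = IndexRecord(channel="V11", timestamp="2001-01-30T08:15:00",
                  application="people-counting", people_count=17)
```

Only the fields relevant to the application that produced the record are filled in; the rest stay empty.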

Processing unit 16 or 18 may alert control unit 20 when one of the applications installed in it detects a condition corresponding to one of the predefined alert rules (step 110). An example of an alert-rule may be the detection of more than a predefined number of people in a zone covered by one of video sensors 12. Another example of an alert-rule may be the detection of a movement of an object larger than a predefined size from the right side to the left side of a zone covered by one of the sensors. Yet another example may be the detection of a particular face or a particular voice.

Each alert sent by one of processing units 16 or 18 to control unit 20 may also be stored in database 30. The stored data may contain details about the alert, such as the time of occurrence, the identifier of the sensor coupled to the processing unit providing the alert and the like.

Upon receiving an alert, control unit 20 may send a message to at least one of clients 40 notifying it about the alert. Additionally or alternatively, control unit 20 may command application bank 24 to alter the applications installed in some of processing units 16 and/or 18. Alternatively, control unit 20 may directly command processing units 16 and/or 18 to activate or deactivate any application installed in the units (step 112). The new commands may be set according to predefined post-alert action rules installed in control unit 20.

A non-limiting example of a post-alert action-rule may be: If one of video sensors 12 detects a movement, install face recognition application 28 in the processing unit 16, which is coupled to that sensor. Another example of a post-alert action-rule may be: If a particular person is identified by one of processing units 16, activate the compression application and record the video signal of the sensor 12 coupled to that processing unit. A third example may be: If one of audio sensors 14 identifies the voice of a particular person, install face recognition application to a specific processing unit 16 coupled to video sensor 12 and start compression and recording of the video signal of that sensor.
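The three post-alert action rules above can be sketched as a lookup table mapping an alert type to the commands the control unit issues. The rule encoding ("install:…", "activate:…") is an illustrative assumption, not the patent's format.

```python
# Hypothetical post-alert action rules, one entry per alert type.
POST_ALERT_RULES = [
    {"on": "movement",
     "do": ["install:face-recognition"]},
    {"on": "person-identified",
     "do": ["activate:compression", "record:video"]},
    {"on": "voice-identified",
     "do": ["install:face-recognition", "activate:compression", "record:video"]},
]


def actions_for(alert_type):
    """Return the commands triggered by an alert, or nothing if no rule matches."""
    for rule in POST_ALERT_RULES:
        if rule["on"] == alert_type:
            return rule["do"]
    return []
```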

The internal rules of control unit 20 may include the alteration of at least certain of the internal rules according to a time-based scheduler (not shown) stored in control unit 20.

Reference is now made to FIGS. 4A and 4B, which are block diagrams of video-processing unit 16 of FIG. 1 according to some embodiments of the present invention. For clarity, FIGS. 4A and 4B and the description given hereinbelow refer only to video-processing units. However, it will be appreciated by persons skilled in the art that audio-processing units 18 may have a similar structure.

Video-processing unit 16A may comprise an analog to digital (A/D) video signal converter 50 as illustrated in FIG. 4A. A/D video converter 50 may receive analog video signals from one of video sensors 12 and convert the analog signals into digital video signals.

Alternatively, video-processing unit 16B may comprise an Internet protocol (IP) to digital video signal converter 51 as illustrated in FIG. 4B. Converter 51 may receive video signals over the IP protocol from one of video sensors 12 and extract the video signals from the IP protocol.

Video-processing unit 16 may further comprise a processing module 52, an internal control unit 54, and a communication unit 56. Internal control unit 54 may receive applications from application bank 24 and may install the applications in processing module 52. Internal control unit 54 may further receive commands from control unit 20 and alert control unit 20 when a condition corresponding to a rule is detected.

Processing module 52 may be a digital processor able to execute the applications installed by application bank 24. More than one application may be installed in video-processing unit 16. Processing unit 16 may further compress the audio and video signals and transfer the compressed data to storage media 32 via communication unit 56. Processing module 52 may further transfer indexing data and the results of the applications to database 30 via communication unit 56. Non-limiting examples of communication unit 56 include a software interface, a CTI interface and an IP modem.
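The structure of FIGS. 4A and 4B can be sketched as a small pipeline: a converter front end, a processing module that runs whatever applications are currently installed, and a communication unit that forwards the results. Class and parameter names are illustrative assumptions.

```python
class VideoProcessingUnit:
    """Sketch of unit 16: converter -> installed applications -> communication."""

    def __init__(self, converter, communication):
        self.converter = converter          # A/D (FIG. 4A) or IP (FIG. 4B) front end
        self.applications = []              # installed by the internal control unit
        self.communication = communication  # forwards results toward the database

    def install(self, app):
        self.applications.append(app)

    def process(self, raw_signal):
        digital = self.converter(raw_signal)  # convert to a digital signal
        # Run every installed application on the converted signal.
        results = {app.__name__: app(digital) for app in self.applications}
        self.communication(results)           # send indexing data onward
        return results


def motion_detect(frames):
    """Toy application: 'motion' if more than one frame arrived."""
    return len(frames) > 1


sink = []  # stands in for the communication path to the database
unit = VideoProcessingUnit(converter=list, communication=sink.append)
unit.install(motion_detect)
results = unit.process("ab")  # a two-'frame' toy signal
```

Swapping the `converter` argument is what distinguishes the A/D variant of FIG. 4A from the IP variant of FIG. 4B; the rest of the unit is unchanged.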

The following examples are given, by way of illustration only, to show certain aspects of some embodiments of the present invention without limiting its scope.

EXAMPLE I

An operator commands control unit 20 via client 40:

  • Install in all video-processing units a video compression application.
  • Install at 08:00, in video-processing units coupled to video sensors #V1-#V2, a face-recognition application, and at 18:00 a motion detection application.
  • Install in video-processing units coupled to video sensors #V11-#V16 a people-counting application.
  • Install in video-processing units coupled to video sensors #V17-#V20 a motion detection application.
  • Record for one minute the compressed video data received from any processing unit if a motion is detected or if the face-recognition application fails to identify a face.
  • If more than 20 people are detected by video sensors #V11-#V16, compress the video data until the number of people is less than 20.
  • If a movement is detected by more than 30 video sensors within an hour, install people-counting application in video-processing units coupled to video sensors #V21-#V30.
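The operator commands of Example I can be restated as a rule configuration of the kind control unit 20 might hold. The schema (rule-set names, field names) is an assumption for illustration; the values are taken directly from the example.

```python
EXAMPLE_I_RULES = {
    "installation": [
        {"units": "all-video", "app": "video-compression"},
        {"units": "V1-V2",     "app": "face-recognition",  "at": "08:00"},
        {"units": "V1-V2",     "app": "motion-detection",  "at": "18:00"},
        {"units": "V11-V16",   "app": "people-counting"},
        {"units": "V17-V20",   "app": "motion-detection"},
    ],
    "recording": [
        {"trigger": "motion detected or face not identified",
         "action": "record compressed video for one minute"},
    ],
    "post_alert": [
        {"trigger": "people_count > 20 on V11-V16",
         "action": "compress video until people_count < 20"},
        {"trigger": "movement on more than 30 sensors within an hour",
         "action": "install people-counting on V21-V30"},
    ],
}
```

Each entry falls under one of the rule sets described earlier (installation, recording, post-alert action), which is how a single operator session can reconfigure the whole system.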
EXAMPLE II

Mr. X has to be located immediately. An authorized operator commands control unit 20 via client 40 to add at least one rule regarding Mr. X.

  • Install in all video-processing units a face-recognition application.
  • Install in all audio-processing units a voice-recognition application.
  • Notify control unit when Mr. X is located.
EXAMPLE III-Off Line Investigation

Calculate the number of people in the lobby at 08:00-08:30 and at 17:00-17:30, Monday to Friday.

  • An operator downloads a people-counting application to client 40.
  • The operator requests playback of recorded video data from the video sensor installed in the lobby according to the required times.
  • Client 40 executes the application and sends a report to its display and/or printer 42.

While certain features of the invention have been illustrated and described herein, many modifications, substitutions, changes, and equivalents will now occur to those of ordinary skill in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.

Claims (23)

1. A system comprising:
two or more processing units, each coupled to a respective video or audio sensor, each of which receives video or audio data from said sensor and processes said video or audio data according to one or more content-analysis applications installed therein;
an application bank external to said processing units, said application bank comprising said content-analysis applications; and
a control unit having installed therein a set of post-alert action rules, the control unit coupled to said processing units and to said application bank, said control unit automatically, without user input, and dynamically instructs said application bank to install in real-time, after receiving an alert from one or more of said processing units, at least another one of said content-analysis applications into at least one of said processing units based on at least one of said post-alert action rules;
wherein said processing unit is able to notify said control unit when one of said applications installed in said processing unit detects a predefined condition associated with at least a portion of said audio or video data;
wherein said control unit is able to instruct one of said processing units to activate or deactivate one of said content-analysis applications already installed in said processing unit based on an alert received from said processing unit or another one of said processing units; and
providing to a client computer a real-time stream of video data, audio data or a combination thereof according to a predetermined time-based schedule.
2. The system of claim 1, wherein at least one of said content-analysis applications is a video movement-detecting application, a video based people counting application, a face detection and recognition application, a voice detection and recognition application, an object detection application or a recognition and surveillance application.
3. The system of claim 1, wherein said application bank further comprises at least a conversion of speech to text application or a video compression application.
4. The system of claim 1 further comprising at least one additional processing unit coupled to a sensor, which is a smoke sensor, a fire sensor, a motion detector, a sound detector, a presence sensor, a movement sensor, a volume sensor or a glass breakage sensor.
5. The system of claim 1 further comprising a database to store indexing data associated with said video or audio data.
6. The system of claim 1, wherein said application bank, said control unit and said processing units are all coupled via a local area or a wide area network.
7. The system of claim 1 wherein providing said real-time data comprises providing synchronized video data received from at least two sensors.
8. The system of claim 1, wherein one of said post-alert action rules is to install a face recognition application upon detection of a movement.
9. The system of claim 1, wherein one of said post-alert action rules is to install a video compression application upon identification of a particular person.
10. The system of claim 1, wherein one of said post-alert action rules is to install a face recognition application upon identification of the voice of a particular person.
11. A system comprising:
an application bank having one or more content-analysis applications;
a processing unit coupled to said application bank and to a video sensor, the application bank being external to said processing unit, wherein said processing unit receives video data from said sensor, processes said data according to at least one of said content-analysis applications installed therein, and sends an alert when a predefined condition associated with at least a portion of said data is detected; and
a control unit having installed therein a set of post-alert action rules, the control unit coupled to said processing unit and to said application bank, said control unit is to instruct automatically, without user input, and dynamically said application bank to install, in real-time after receiving the alert, another one of said content-analysis applications into said processing unit according to at least one of said post-alert action rules;
wherein said processing unit is able to notify said control unit when one of said applications installed in said processing unit detects a predefined condition associated with at least a portion of said audio or video data;
wherein said control unit is able to instruct one of said processing units to activate or deactivate one of said content-analysis applications already installed in said processing unit based on an alert received from said processing unit or another one of said processing units;
providing to a client computer a real-time stream of video data, audio data or a combination thereof according to a predetermined time-based schedule.
12. A method comprising:
storing predefined post-alert action rules in a control unit;
detecting a predefined condition associated with at least a portion of an audio or video data received from a video or audio sensor by processing the data at a processing unit coupled to the sensor according to one or more content-analysis applications;
sending an alert based on the detected predefined condition; and
automatically, without user input, and dynamically in real-time after receiving the alert, instructing to install another content-analysis application into a video or audio processing unit from an application bank external to said processing unit having content-analysis applications based on at least one of said predefined post-alert action rules;
wherein said processing unit is able to notify said control unit when one of said applications installed in said processing unit detects a predefined condition associated with at least a portion of said audio or video data;
wherein said control unit is able to instruct one of said processing units to activate or deactivate one of said content-analysis applications already installed in said processing unit based on an alert received from said processing unit or another one of said processing units;
providing to a client computer a real-time stream of video data, audio data or a combination thereof according to a predetermined time-based schedule.
13. The method of claim 12 further comprising:
recording at least a portion of said data.
14. The method of claim 13 further comprising:
providing to a client computer recorded data upon receiving a request from said client computer.
15. The method of claim 13 further comprising:
downloading at least one content-analysis application from said application bank to a client computer;
providing to said client computer recorded data upon receiving a request from said client computer; and
processing said recorded data according to at least one of said installed applications.
16. The method of claim 15 wherein providing said recorded data comprises providing synchronized video data received from at least two sensors.
17. The method of claim 13 further comprising:
storing results of said content-analysis applications.
18. The method of claim 17, wherein the results comprise a number of people counted by a people counting application.
19. The method of claim 17, wherein the results comprise an identifier associated with a voice recognized by a voice recognition application.
20. The method of claim 17, wherein the results comprise an identifier associated with a face recognized by a face recognition application.
21. The method of claim 12 further comprising:
providing to a client computer a real-time stream of video data, audio data or a combination thereof upon receiving a request from said client computer.
22. The method of claim 21 wherein providing said real-time data comprises providing synchronized video data received from at least two sensors.
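Claims 16 and 22 recite providing synchronized video received from at least two sensors. One simple way to realize this is a nearest-timestamp merge of the two streams; the sketch below is an illustrative assumption (the patent does not specify a synchronization algorithm), with each stream modeled as a timestamp-sorted list of `(timestamp, frame)` tuples.

```python
def synchronize(stream_a, stream_b, tolerance=0.04):
    """Pair frames from two sensors whose timestamps fall within
    `tolerance` seconds of each other. Frames with no counterpart
    in the other stream are skipped."""
    pairs = []
    i = j = 0
    while i < len(stream_a) and j < len(stream_b):
        ta, fa = stream_a[i]
        tb, fb = stream_b[j]
        if abs(ta - tb) <= tolerance:
            pairs.append((ta, fa, fb))  # emit a synchronized pair
            i += 1
            j += 1
        elif ta < tb:
            i += 1  # sensor A frame has no counterpart yet
        else:
            j += 1  # sensor B frame has no counterpart yet
    return pairs
```

A real system would typically stamp frames against a shared clock at capture time; this merge only handles the alignment step on playback or delivery.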
23. A method comprising:
installing one or more content-analysis applications from an application bank into one or more video or audio processing units, the application bank being external to said processing units;
storing predefined post-alert action rules in a control unit;
processing input received from one or more video or audio sensors, each coupled to a respective video or audio processing unit according to at least one of said content-analysis applications;
detecting a predefined condition associated with at least one portion of said input;
sending a notification associated with said condition to a control unit; and
automatically, without user input, and dynamically in real-time after receiving the notification, instructing the application bank to install at least another one of said content-analysis applications into at least one of said processing units based on at least one of said predefined post-alert action rules;
wherein said processing unit is able to notify said control unit when one of said applications installed in said processing unit detects a predefined condition associated with at least a portion of said audio or video data;
wherein said control unit is able to instruct one of said processing units to activate or deactivate one of said content-analysis applications already installed in said processing unit based on an alert received from said processing unit or another one of said processing units;
providing to a client computer a real-time stream of video data, audio data or a combination thereof according to a predetermined time-based schedule.
US10/056,049 2001-01-30 2002-01-28 Video and audio content analysis system Active 2023-11-05 US7346186B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US26472501P 2001-01-30 2001-01-30
US10/056,049 US7346186B2 (en) 2001-01-30 2002-01-28 Video and audio content analysis system

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US10/056,049 US7346186B2 (en) 2001-01-30 2002-01-28 Video and audio content analysis system
US12/025,291 US7532744B2 (en) 2001-01-30 2008-02-04 Video and audio content analysis system
US12/418,675 US7751590B2 (en) 2001-01-30 2009-04-06 Video and audio content analysis system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/025,291 Continuation US7532744B2 (en) 2001-01-30 2008-02-04 Video and audio content analysis system

Publications (2)

Publication Number Publication Date
US20020110264A1 US20020110264A1 (en) 2002-08-15
US7346186B2 true US7346186B2 (en) 2008-03-18

Family

ID=26734906

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/056,049 Active 2023-11-05 US7346186B2 (en) 2001-01-30 2002-01-28 Video and audio content analysis system
US12/025,291 Active US7532744B2 (en) 2001-01-30 2008-02-04 Video and audio content analysis system
US12/418,675 Active US7751590B2 (en) 2001-01-30 2009-04-06 Video and audio content analysis system

Family Applications After (2)

Application Number Title Priority Date Filing Date
US12/025,291 Active US7532744B2 (en) 2001-01-30 2008-02-04 Video and audio content analysis system
US12/418,675 Active US7751590B2 (en) 2001-01-30 2009-04-06 Video and audio content analysis system

Country Status (1)

Country Link
US (3) US7346186B2 (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060028488A1 (en) * 2004-08-09 2006-02-09 Shay Gabay Apparatus and method for multimedia content based manipulation
US20060197666A1 (en) * 2005-02-18 2006-09-07 Honeywell International, Inc. Glassbreak noise detector and video positioning locator
US20060239645A1 (en) * 2005-03-31 2006-10-26 Honeywell International Inc. Event packaged video sequence
US20060262919A1 (en) * 2005-05-18 2006-11-23 Christopher Danson Method and system for analyzing separated voice data of a telephonic communication between a customer and a contact center by applying a psychological behavioral model thereto
US20070071404A1 (en) * 2005-09-29 2007-03-29 Honeywell International Inc. Controlled video event presentation
US20070194906A1 (en) * 2006-02-22 2007-08-23 Federal Signal Corporation All hazard residential warning system
US20070195939A1 (en) * 2006-02-22 2007-08-23 Federal Signal Corporation Fully Integrated Light Bar
US20070195706A1 (en) * 2006-02-22 2007-08-23 Federal Signal Corporation Integrated municipal management console
US20070213088A1 (en) * 2006-02-22 2007-09-13 Federal Signal Corporation Networked fire station management
US20080240376A1 (en) * 2007-03-30 2008-10-02 Kelly Conway Method and system for automatically routing a telephonic communication base on analytic attributes associated with prior telephonic communication
US20080240374A1 (en) * 2007-03-30 2008-10-02 Kelly Conway Method and system for linking customer conversation channels
US20080240404A1 (en) * 2007-03-30 2008-10-02 Kelly Conway Method and system for aggregating and analyzing data relating to an interaction between a customer and a contact center agent
US20080260122A1 (en) * 2005-05-18 2008-10-23 Kelly Conway Method and system for selecting and navigating to call examples for playback or analysis
US20090016575A1 (en) * 2007-07-13 2009-01-15 Samsung Electronics Co., Ltd. Real-time face recognition-based selective recording apparatus and method
US20090103709A1 (en) * 2007-09-28 2009-04-23 Kelly Conway Methods and systems for determining and displaying business relevance of telephonic communications between customers and a contact center
US20090119686A1 (en) * 2001-09-21 2009-05-07 Monroe David A Method and Apparatus for Interconnectivity Between Legacy Security Systems and Networked Multimedia Security Surveillance Systems
US20100026811A1 (en) * 2007-02-02 2010-02-04 Honeywell International Inc. Systems and methods for managing live video data
US20100239130A1 (en) * 2009-03-18 2010-09-23 Industrial Technology Research Institute System and method for performing rapid facial recognition
US7905640B2 (en) 2006-03-31 2011-03-15 Federal Signal Corporation Light bar and method for making
US20110201960A1 (en) * 2010-02-18 2011-08-18 Bank Of America Systems for inducing change in a human physiological characteristic
US20110201959A1 (en) * 2010-02-18 2011-08-18 Bank Of America Systems for inducing change in a human physiological characteristic
US20110201899A1 (en) * 2010-02-18 2011-08-18 Bank Of America Systems for inducing change in a human physiological characteristic
US8023639B2 (en) 2007-03-30 2011-09-20 Mattersight Corporation Method and system determining the complexity of a telephonic communication received by a contact center
US8878931B2 (en) 2009-03-04 2014-11-04 Honeywell International Inc. Systems and methods for managing video data
US9346397B2 (en) 2006-02-22 2016-05-24 Federal Signal Corporation Self-powered light bar
US10129582B2 (en) 2015-06-30 2018-11-13 Kempt, LLC Systems, methods, and computer program products for capturing spectator content displayed at live events

Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6819783B2 (en) * 1996-09-04 2004-11-16 Centerframe, Llc Obtaining person-specific images in a public venue
JP2003087771A (en) * 2001-09-07 2003-03-20 Oki Electric Ind Co Ltd Monitoring system and monitoring method
US20030154270A1 (en) * 2002-02-12 2003-08-14 Loss Prevention Management, Inc., New Mexico Corporation Independent and integrated centralized high speed system for data management
US20040233983A1 (en) * 2003-05-20 2004-11-25 Marconi Communications, Inc. Security system
US8086448B1 (en) * 2003-06-24 2011-12-27 Creative Technology Ltd Dynamic modification of a high-order perceptual attribute of an audio signal
US8626514B2 (en) * 2004-08-31 2014-01-07 Emc Corporation Interface for management of multiple auditory communications
US8244542B2 (en) * 2004-07-01 2012-08-14 Emc Corporation Video surveillance
US7499531B2 (en) 2003-09-05 2009-03-03 Emc Corporation Method and system for information lifecycle management
US8229904B2 (en) * 2004-07-01 2012-07-24 Emc Corporation Storage pools for information management
US8209185B2 (en) * 2003-09-05 2012-06-26 Emc Corporation Interface for management of auditory communications
US20060004818A1 (en) * 2004-07-01 2006-01-05 Claudatos Christopher H Efficient information management
US8180743B2 (en) * 2004-07-01 2012-05-15 Emc Corporation Information management
US9268780B2 (en) * 2004-07-01 2016-02-23 Emc Corporation Content-driven information lifecycle management
US8103873B2 (en) 2003-09-05 2012-01-24 Emc Corporation Method and system for processing auditory communications
US20060004579A1 (en) * 2004-07-01 2006-01-05 Claudatos Christopher H Flexible video surveillance
US7444287B2 (en) * 2004-07-01 2008-10-28 Emc Corporation Efficient monitoring system and method
US7707037B2 (en) * 2004-07-01 2010-04-27 Emc Corporation Archiving of surveillance data
US7457396B2 (en) * 2003-09-05 2008-11-25 Emc Corporation Automated call management
US8180742B2 (en) 2004-07-01 2012-05-15 Emc Corporation Policy-based information management
EP1519314A1 (en) * 2003-09-25 2005-03-30 Siemens Building Technologies AG Method and analysis tool for checking functionality of video surveillance devices and measuring system for carrying out the method
US7783513B2 (en) * 2003-10-22 2010-08-24 Intellisist, Inc. Business performance and customer care quality measurement
US20050114379A1 (en) * 2003-11-25 2005-05-26 Lee Howard M. Audio/video service quality analysis of customer/agent interaction
US8295446B1 (en) * 2004-09-03 2012-10-23 Confinement Telephony Technology, Llc Telephony system and method with enhanced call monitoring, recording and retrieval
US20060238616A1 (en) * 2005-03-31 2006-10-26 Honeywell International Inc. Video image processing appliance manager
EP1889480A1 (en) * 2005-05-27 2008-02-20 Overview Limited Apparatus, system and method for processing and transferring captured video data
GB0510890D0 (en) * 2005-05-27 2005-07-06 Overview Ltd Apparatus, system and method for processing and transferring captured video data
US7671728B2 (en) * 2006-06-02 2010-03-02 Sensormatic Electronics, LLC Systems and methods for distributed monitoring of remote sites
CN103824407B (en) * 2006-06-02 2017-05-24 传感电子有限责任公司 Distributed system and method for remotely monitoring field point
JP2008310895A (en) * 2007-06-15 2008-12-25 Hitachi Maxell Ltd Data recorder and data recording system using the same
JP5390322B2 (en) * 2009-09-28 2014-01-15 株式会社東芝 Image processing apparatus and image processing method
US8698888B2 (en) * 2009-10-30 2014-04-15 Medical Motion, Llc Systems and methods for comprehensive human movement analysis
JP5147874B2 (en) * 2010-02-10 2013-02-20 日立オートモティブシステムズ株式会社 In-vehicle image processing device
WO2011114668A1 (en) * 2010-03-18 2011-09-22 パナソニック株式会社 Data processing device and data processing method
WO2013184667A1 (en) 2012-06-05 2013-12-12 Rank Miner, Inc. System, method and apparatus for voice analytics of recorded audio
US9836118B2 (en) 2015-06-16 2017-12-05 Wilson Steele Method and system for analyzing a movement of a person
US10192418B1 (en) 2018-06-11 2019-01-29 Geoffrey M. Kern System and method for perimeter security

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5751707A (en) * 1995-06-19 1998-05-12 Bell Atlantic Network Services, Inc. AIN interaction through wireless digital video network
US5987154A (en) * 1993-07-19 1999-11-16 Lucent Technologies Inc. Method and means for detecting people in image sequences
US6246320B1 (en) * 1999-02-25 2001-06-12 David A. Monroe Ground link with on-board security surveillance system for aircraft and other commercial vehicles
US6275855B1 (en) * 1997-11-02 2001-08-14 R. Brent Johnson System, method and article of manufacture to enhance computerized alert system information awareness and facilitate real-time intervention services
US6330025B1 (en) * 1999-05-10 2001-12-11 Nice Systems Ltd. Digital video logging system
US6697103B1 (en) * 1998-03-19 2004-02-24 Dennis Sunga Fernandez Integrated network for monitoring remote objects
US6826173B1 (en) * 1999-12-30 2004-11-30 At&T Corp. Enhanced subscriber IP alerting

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5987154A (en) * 1993-07-19 1999-11-16 Lucent Technologies Inc. Method and means for detecting people in image sequences
US5751707A (en) * 1995-06-19 1998-05-12 Bell Atlantic Network Services, Inc. AIN interaction through wireless digital video network
US6275855B1 (en) * 1997-11-02 2001-08-14 R. Brent Johnson System, method and article of manufacture to enhance computerized alert system information awareness and facilitate real-time intervention services
US6697103B1 (en) * 1998-03-19 2004-02-24 Dennis Sunga Fernandez Integrated network for monitoring remote objects
US6246320B1 (en) * 1999-02-25 2001-06-12 David A. Monroe Ground link with on-board security surveillance system for aircraft and other commercial vehicles
US6330025B1 (en) * 1999-05-10 2001-12-11 Nice Systems Ltd. Digital video logging system
US6856343B2 (en) * 1999-05-10 2005-02-15 Nice Systems Ltd. Digital video logging system
US6826173B1 (en) * 1999-12-30 2004-11-30 At&T Corp. Enhanced subscriber IP alerting

Cited By (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090119686A1 (en) * 2001-09-21 2009-05-07 Monroe David A Method and Apparatus for Interconnectivity Between Legacy Security Systems and Networked Multimedia Security Surveillance Systems
US20060028488A1 (en) * 2004-08-09 2006-02-09 Shay Gabay Apparatus and method for multimedia content based manipulation
US7714878B2 (en) 2004-08-09 2010-05-11 Nice Systems, Ltd. Apparatus and method for multimedia content based manipulation
US20060197666A1 (en) * 2005-02-18 2006-09-07 Honeywell International, Inc. Glassbreak noise detector and video positioning locator
US7812855B2 (en) * 2005-02-18 2010-10-12 Honeywell International Inc. Glassbreak noise detector and video positioning locator
US7760908B2 (en) 2005-03-31 2010-07-20 Honeywell International Inc. Event packaged video sequence
US20060239645A1 (en) * 2005-03-31 2006-10-26 Honeywell International Inc. Event packaged video sequence
US20060262919A1 (en) * 2005-05-18 2006-11-23 Christopher Danson Method and system for analyzing separated voice data of a telephonic communication between a customer and a contact center by applying a psychological behavioral model thereto
US9432511B2 (en) 2005-05-18 2016-08-30 Mattersight Corporation Method and system of searching for communications for playback or analysis
US10104233B2 (en) 2005-05-18 2018-10-16 Mattersight Corporation Coaching portal and methods based on behavioral assessment data
US9225841B2 (en) 2005-05-18 2015-12-29 Mattersight Corporation Method and system for selecting and navigating to call examples for playback or analysis
US9692894B2 (en) 2005-05-18 2017-06-27 Mattersight Corporation Customer satisfaction system and method based on behavioral assessment data
US8094803B2 (en) 2005-05-18 2012-01-10 Mattersight Corporation Method and system for analyzing separated voice data of a telephonic communication between a customer and a contact center by applying a psychological behavioral model thereto
US20080260122A1 (en) * 2005-05-18 2008-10-23 Kelly Conway Method and system for selecting and navigating to call examples for playback or analysis
US20070071404A1 (en) * 2005-09-29 2007-03-29 Honeywell International Inc. Controlled video event presentation
US9878656B2 (en) 2006-02-22 2018-01-30 Federal Signal Corporation Self-powered light bar
US9002313B2 (en) 2006-02-22 2015-04-07 Federal Signal Corporation Fully integrated light bar
US20070213088A1 (en) * 2006-02-22 2007-09-13 Federal Signal Corporation Networked fire station management
US7746794B2 (en) 2006-02-22 2010-06-29 Federal Signal Corporation Integrated municipal management console
US20070195706A1 (en) * 2006-02-22 2007-08-23 Federal Signal Corporation Integrated municipal management console
US20070195939A1 (en) * 2006-02-22 2007-08-23 Federal Signal Corporation Fully Integrated Light Bar
US20070194906A1 (en) * 2006-02-22 2007-08-23 Federal Signal Corporation All hazard residential warning system
US9346397B2 (en) 2006-02-22 2016-05-24 Federal Signal Corporation Self-powered light bar
US20110156589A1 (en) * 2006-03-31 2011-06-30 Federal Signal Corporation Light bar and method for making
US7905640B2 (en) 2006-03-31 2011-03-15 Federal Signal Corporation Light bar and method for making
US8636395B2 (en) 2006-03-31 2014-01-28 Federal Signal Corporation Light bar and method for making
US9550453B2 (en) 2006-03-31 2017-01-24 Federal Signal Corporation Light bar and method of making
US20100026811A1 (en) * 2007-02-02 2010-02-04 Honeywell International Inc. Systems and methods for managing live video data
US9172918B2 (en) 2007-02-02 2015-10-27 Honeywell International Inc. Systems and methods for managing live video data
US9270826B2 (en) 2007-03-30 2016-02-23 Mattersight Corporation System for automatically routing a communication
US8023639B2 (en) 2007-03-30 2011-09-20 Mattersight Corporation Method and system determining the complexity of a telephonic communication received by a contact center
US10129394B2 (en) 2007-03-30 2018-11-13 Mattersight Corporation Telephonic communication routing system based on customer satisfaction
US8718262B2 (en) 2007-03-30 2014-05-06 Mattersight Corporation Method and system for automatically routing a telephonic communication base on analytic attributes associated with prior telephonic communication
US20080240376A1 (en) * 2007-03-30 2008-10-02 Kelly Conway Method and system for automatically routing a telephonic communication base on analytic attributes associated with prior telephonic communication
US20080240374A1 (en) * 2007-03-30 2008-10-02 Kelly Conway Method and system for linking customer conversation channels
US9699307B2 (en) 2007-03-30 2017-07-04 Mattersight Corporation Method and system for automatically routing a telephonic communication
US8891754B2 (en) 2007-03-30 2014-11-18 Mattersight Corporation Method and system for automatically routing a telephonic communication
US8983054B2 (en) 2007-03-30 2015-03-17 Mattersight Corporation Method and system for automatically routing a telephonic communication
US20080240404A1 (en) * 2007-03-30 2008-10-02 Kelly Conway Method and system for aggregating and analyzing data relating to an interaction between a customer and a contact center agent
US9124701B2 (en) 2007-03-30 2015-09-01 Mattersight Corporation Method and system for automatically routing a telephonic communication
US20090016575A1 (en) * 2007-07-13 2009-01-15 Samsung Electronics Co., Ltd. Real-time face recognition-based selective recording apparatus and method
US8300898B2 (en) * 2007-07-13 2012-10-30 Samsung Electronics Co., Ltd. Real-time face recognition-based selective recording apparatus and method
US20090103709A1 (en) * 2007-09-28 2009-04-23 Kelly Conway Methods and systems for determining and displaying business relevance of telephonic communications between customers and a contact center
US10419611B2 (en) 2007-09-28 2019-09-17 Mattersight Corporation System and methods for determining trends in electronic communications
US8878931B2 (en) 2009-03-04 2014-11-04 Honeywell International Inc. Systems and methods for managing video data
US20100239130A1 (en) * 2009-03-18 2010-09-23 Industrial Technology Research Institute System and method for performing rapid facial recognition
US8244002B2 (en) 2009-03-18 2012-08-14 Industrial Technology Research Institute System and method for performing rapid facial recognition
US20110201960A1 (en) * 2010-02-18 2011-08-18 Bank Of America Systems for inducing change in a human physiological characteristic
US9138186B2 (en) 2010-02-18 2015-09-22 Bank Of America Corporation Systems for inducing change in a performance characteristic
US8715178B2 (en) 2010-02-18 2014-05-06 Bank Of America Corporation Wearable badge with sensor
US8715179B2 (en) 2010-02-18 2014-05-06 Bank Of America Corporation Call center quality management tool
US20110201899A1 (en) * 2010-02-18 2011-08-18 Bank Of America Systems for inducing change in a human physiological characteristic
US20110201959A1 (en) * 2010-02-18 2011-08-18 Bank Of America Systems for inducing change in a human physiological characteristic
US10129582B2 (en) 2015-06-30 2018-11-13 Kempt, LLC Systems, methods, and computer program products for capturing spectator content displayed at live events

Also Published As

Publication number Publication date
US20090196462A1 (en) 2009-08-06
US20080144887A1 (en) 2008-06-19
US7532744B2 (en) 2009-05-12
US7751590B2 (en) 2010-07-06
US20020110264A1 (en) 2002-08-15

Similar Documents

Publication Publication Date Title
CN100336052C (en) Security system and operating method thereof
KR100915847B1 (en) Streaming video bookmarks
US8972481B2 (en) Playlist generation method and apparatus
US5946445A (en) Media recorder for capture and playback of live and prerecorded audio and/or video information
US7292979B2 (en) Time ordered indexing of audio data
EP1659518A2 (en) Automatic face extraction
US20100020172A1 (en) Performing real-time analytics using a network processing solution able to directly ingest ip camera video streams
US9729989B2 (en) Home automation sound detection and positioning
US7015943B2 (en) Premises entry security system
US5400011A (en) Method and apparatus for enhancing remote audio monitoring in security systems
US6748481B1 (en) Streaming information appliance with circular buffer for receiving and selectively reading blocks of streaming information
US8917186B1 (en) Audio monitoring and sound identification process for remote alarms
US6378035B1 (en) Streaming information appliance with buffer read and write synchronization
US7877438B2 (en) Method and apparatus for identifying new media content
US20050110634A1 (en) Portable security platform
JP2014504112A (en) Information processing using a set of data acquisition devices
US6330025B1 (en) Digital video logging system
EP1332479B1 (en) Integrated security system
KR100873316B1 (en) Multi video device control and expansion method and apparatus
US6820144B2 (en) Data format for a streaming information appliance
CA2381960C (en) System and method for digital video management
US6463486B1 (en) System for handling streaming information using a plurality of reader modules by enumerating output pins and associated streams of information
EP1391859A1 (en) Digital video securtiy system
US6658091B1 (en) LIfestyle multimedia security system
US7817716B2 (en) Method and/or apparatus for analyzing the content of a surveillance image

Legal Events

Date Code Title Description
AS Assignment

Owner name: NICE SYSTEMS, LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHARONI, DAVID;KATZ, HAGAI;KATZMAN, YEHUDA;AND OTHERS;REEL/FRAME:012835/0175

Effective date: 20020422

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

AS Assignment

Owner name: QOGNIFY LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NICE SYSTEMS LTD.;REEL/FRAME:036615/0243

Effective date: 20150918

AS Assignment

Owner name: MONROE CAPITAL MANAGEMENT ADVISORS, LLC, ILLINOIS

Free format text: SECURITY INTEREST;ASSIGNORS:QOGNIFY LTD.;ON-NET SURVEILLANCE SYSTEMS INC.;REEL/FRAME:047871/0771

Effective date: 20181228

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2553); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Year of fee payment: 12