WO2023279140A1 - Visual display system - Google Patents

Visual display system

Info

Publication number
WO2023279140A1
WO2023279140A1 (PCT Application No. PCT/AU2022/050668)
Authority
WO
WIPO (PCT)
Prior art keywords
activity
display
clips
clip
display window
Prior art date
Application number
PCT/AU2022/050668
Other languages
French (fr)
Inventor
Matthew Macfarlane
Kevin Brown
Sarah Payne
Anupiya Nugaliyadde
Original Assignee
Icetana Limited
Priority date
Filing date
Publication date
Application filed by Icetana Limited filed Critical Icetana Limited
Publication of WO2023279140A1 publication Critical patent/WO2023279140A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2625Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of images from a temporal image sequence, e.g. for a stroboscopic effect
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19665Details related to the storage of video surveillance data
    • G08B13/19671Addition of non-video data, i.e. metadata, to video stream
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19678User interface
    • G08B13/19682Graphic User Interface [GUI] presenting system data to the user, e.g. information on a screen helping a user interacting with an alarm system
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34Indicating arrangements 
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/36Monitoring, i.e. supervising the progress of recording or reproducing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8549Creating video summaries, e.g. movie trailer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/188Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639Details of the system layout
    • G08B13/19645Multiple cameras, each having view on one of a plurality of scenes, e.g. multiple cameras for multi-room surveillance or for tracking an object by view hand-over

Definitions

  • the invention relates to a visual display system.
  • the invention is particularly suited for displaying video footage taken from one or more security cameras.
  • Security cameras can be generally categorised according to their intended purpose.
  • the first category of security cameras are those trained on a particular object, such as a door.
  • the second category of security cameras are those trained on an area, such as a carpark.
  • the intent of both categories of security cameras remains the same - to capture video of any activity that falls within the security camera’s field of view.
  • the video synopsis approach superimposes a prominent image from one clip over the prominent images of one or more clips, with the resulting images each being overlayed against a common static background image. Each prominent image is timestamped, allowing security personnel to place the clip in context. Clicking on the prominent image or timestamp will then cause the associated clip to be played in full.
  • a video display system for displaying activity clips in a set of activity clips
  • the video display system comprising: a processing sub-system; and a visual display sub-system configured to display a matrix of at least two display windows, where the processing sub-system processes a number of activity clips in the set of activity clips equal to the number of display windows in the matrix such that each of these activity clips is played in a display window of the matrix simultaneous with each other activity clip in the matrix and where, on each subsequent completion of display of an activity clip in a display window, the processing sub-system operates to display in that same display window the next activity clip in the set of activity clips until all activity clips in the set of activity clips have been displayed in a display window.
  • the visual display sub-system may comprise one or more display units, each display unit operable to display a set of display windows.
  • the visual display sub-system comprises a single display unit operable to display a set of nine display windows arranged as a 3x3 matrix.
  • the processing sub-system may operate to display a static image for a predetermined length of time in each display window before playing an activity clip.
  • the static image may also be used as part of a general “splash screen” before playing even the initial activity clips.
  • Each activity clip in the set of activity clips may include metadata and the processing sub-system is operable to filter the activity clips in the set of activity clips based on user defined criteria relating to the metadata.
  • This metadata may also be superimposed by the processing sub-system over the playback of the activity clip.
  • the processing sub-system may operate to display this metadata in a band of the display window separate from the band of the display window in which the activity clip is being played.
  • the metadata concerned may include such elements as the security camera that recorded the activity clip; the location of the security camera that recorded the activity clip; the date and time when the activity clip was recorded; the length of the recorded activity clip (in seconds); a categorisation of the anomaly shown in the activity clip; a priority assessment of the anomaly shown in the activity clip; a textual description of the activity being shown in the activity clip.
  • the processing subsystem may operate to display one or more action buttons in a band of the display window separate from the band of the display window in which the activity clip is being played.
  • the one or more action buttons may be superimposed over the playback of the activity clip.
  • the visual display sub-system is divided into bands, the matrix of display windows being displayed in one band and one or more action buttons are displayed in another band.
  • the action buttons displayed may be one or more of the following: a slider timeline; an approval action button; a playback action button; a save action button; a watch later action button; a watched action button; a highlights action button.
  • the video display system may further comprise an input sub-system.
  • the processing sub-system operates to cease displaying the activity clip in the remaining display windows and use the visual display sub-system to solely display the activity clip being displayed in the selected display window and, to recommence displaying the activity clips in the remaining display windows when the activity clip shown in the selected display window has completed or the user again selects the enlarged display window.
  • the display window may be enlarged to provide a greater viewing area and/or higher resolution view of the activity clip being displayed therein.
  • the video display system may also include an additional display window.
  • the additional display window operable to show a summary image and/or metadata of at least the next activity clip in the set of activity clips to be displayed.
  • a method of displaying activity clips from a set of activity clips comprising the steps of: processing an initial number of activity clips in the set of activity clips equal to a number of display windows in a matrix; simultaneously playing these initial activity clips in separate display windows; on completing playback of each activity clip, playing the next activity clip in the set of activity clips in the same display window as the now completed activity clip until each activity clip in the set of activity clips is or has been displayed in a display window.
  • the method may also include the step of displaying a static image for a predetermined length of time in each display window before playing an activity clip.
  • Each activity clip in the set of activity clips may include metadata.
  • the method may also include the step of filtering the activity clips in the set of activity clips based on user defined criteria relating to the metadata.
  • the method may also include the step of superimposing one or more elements of the metadata over the playback of the activity clip.
  • the method may include the steps of: dividing each display window into bands; and displaying one or more elements of the metadata in a band of the display window separate from the band of the display window in which the activity clip is being played.
  • the method may also include the steps of: dividing each display window into bands; and displaying one or more action buttons in a band of the display window separate from the band of the display window in which the activity clip is being played.
  • the method can also include the step of superimposing one or more action buttons over the playback of the activity clip.
  • the method may include the steps of: selecting a display window using an input sub-system; ceasing to display activity clips in each display window other than the selected display window; displaying the activity clip playing in the selected display window in an enlarged window that obscures at least one other display window; reducing the selected display window back to its original size and position on completion of display of the activity clip or an appropriate user action; and recommencing playback of the activity clips shown in each display window other than the selected display window.
  • the method may also include the step of displaying a summary image and/or metadata of at least the next activity clip in the set of activity clips to be displayed in an additional display window.
  • Figure 1 is a schematic of a video display system according to a first embodiment of the invention.
  • Figure 2 is a schematic of a video display system according to a second embodiment of the invention.
  • Figure 3 is a schematic of a video display system according to a third embodiment of the invention.
  • the visual display system 10 of this embodiment is designed to display selected video footage 12 taken by a set of security cameras 1.
  • Each security camera 1 in the set of security cameras 1 is in data communication with a parsing server 2.
  • the parsing server 2 parses the raw video footage taken by each security camera 1 to create a set of activity clips 3 (i.e. the relevant video footage 12).
  • Each activity clip 3 represents recorded video footage taken by a security camera 1 which the parsing server 2 has determined as anomalous according to its own internal processing rules.
  • It is to be appreciated by the person skilled in the art that the actual hardware that forms the parsing server 2 is immaterial to the present invention. Similarly, the internal processing rules used to determine what parts of the raw data received from a security camera 1 are anomalous and, thus, should form an activity clip 3 to be included in the set of activity clips 3, are also immaterial to the present invention. In this manner, the parsing server 2 should only be viewed as an external entity that provides the video footage that is input for the present invention.
  • the visual display system 10 comprises a processing sub-system 14, an input sub-system 16 and at least one display unit 18.
  • the processing sub-system 14 is in data and control communication with the input sub-system 16 and each display unit 18.
  • the processing sub-system 14 is also in data communication with the parsing server 2.
  • Each display unit 18 has a usable display area 20.
  • the usable display areas 20 of each display unit 18 are arranged so as to be in close proximity to one another and face the working area 4 of security personnel (not shown).
  • the processing sub-system 14 divides the usable display areas 20 of each display unit 18 into at least two display windows 22.
  • the processing sub-system 14 divides the usable display area 20 of each display unit 18a, 18b, 18c into three (3) display windows 22.
  • the display windows 22 are arranged parallel to one another, such that each display unit 18 has a left display window 22a, a central display window 22b and a right display window 22c.
  • the parsing server 2 receives raw video footage from the security cameras 1 and processes this footage as programmed to produce a set of activity clips 3.
  • the set of activity clips 3 is then sent to the visual display system 10, where it is received by the processing sub-system 14.
  • each activity clip 3 in the set of activity clips 3 includes metadata 24 in addition to the relevant video footage.
  • This metadata 24 may include such information as:
  • the processing sub-system 14 processes each activity clip 3 in the set of activity clips 3 in the order set by the parsing server 2.
  • the first frame of the first activity clip 3 in the set of activity clips 3 is displayed in left display window 22a of display unit 18a. Processing continues in a similar manner to fill up each display window 22a, 22b, 22c of each display unit 18a, 18b, 18c.
  • the first frame of the ninth activity clip 3 in the set of activity clips 3 is displayed in right display window 22c of display unit 18c.
  • the processing subsystem 14 also operates to display metadata 24 in the upper right corner of each display window 22.
  • the metadata includes the security camera 1 that recorded the footage and the date and time that the footage was recorded.
  • the processing sub-system 14 awaits user input by way of the input sub-system 16.
  • the input sub-system includes a mouse 26 and the awaited user input is a left mouse click.
  • When the input sub-system 16 records a left mouse click as having occurred, the input sub-system 16 sends a command signal back to the processing sub-system 14 to initiate display of each video. On receipt of the command signal, the processing sub-system 14 operates to simultaneously commence playing of each of the displayed activity clips 3. In this manner, each displayed activity clip 3 is synchronised with each other displayed activity clip 3.
  • the synchronisation referred to here is synchronisation of playback (i.e. the first second of each activity clip 3 is played at the same time, the second second of each activity clip 3 is played at the same time, etc.).
  • the synchronised playback of each activity clip 3 is independent of the date and time that it was recorded.
  • the processing sub-system 14 also operates to overlay the video footage being displayed in each display unit 18a, 18b, 18c with the amount of time that the video footage has been playing.
  • the display unit 18 on which that activity clip 3 has been displayed sends a completion signal back to the processing sub-system 14.
  • the completion signal provides details of the display unit 18 and the display window 22 where the now-finished activity clip 3 was displayed.
  • the display unit 18 operates to display a static image (not shown) in the relevant display window 22 until either a new activity clip 3 is displayed in the display window 22 or all activity clips 3 in the set of activity clips 3 have been displayed in full.
  • the processing sub-system 14 operates to display the next activity clip 3 in the set of activity clips 3 in the now vacated display unit 18/display window 22 combination. As part of this display, the processing sub-system 14 also operates to overlay the associated metadata 24 for that activity clip as already described.
  • Each display unit 18/display window 22 combination can be considered an individual cell in a matrix. After the initial set of activity clips 3 have been distributed to each cell, the next activity clip 3 in the set of activity clips 3 is delivered to the now vacant cell for display.
  • the video display system 200 includes the following modifications.
  • Each display window 22a, 22b, 22c is divided into an upper band 202, a display band 204 and a lower band 206.
  • the activity clip 3 to be displayed in each display window 22a, 22b, 22c is shown in the display band 204.
  • the upper band 202 is used to display a set of approval action buttons 208.
  • the set of approval action buttons 208 comprise a positive approval activity button 208a and a negative approval action button 208b. Each of the approval action buttons 208 uses an appropriate image to indicate its intended function.
  • the lower band 206 is used to display a set of clip action buttons 210 and a playback action button 212.
  • the set of clip action buttons 210 comprise a save action button 210a, a share action button 210b, a watch later action button 210c, a watched action button 210d and a highlight action button 210e.
  • Each of the clip action buttons 210, and the playback action button 212 uses an appropriate image to indicate its intended function.
  • the video display system 200 includes an additional display unit 214.
  • This additional display unit 214 only has a single display window 216.
  • the activity clips 3 in the set of activity clips 3 are displayed in the same manner as described in the first embodiment. However, while they are being displayed, a user may use the input sub-system 16 to action one or more of the action buttons 208, 210, 212 located in either the upper band 202 or the lower band 206 of the relevant display window 22.
  • If the user actions one of the approval action buttons 208a, 208b, the input sub-system 16 sends an appropriate command signal (not shown) back to the processing sub-system 14 indicative of the approval action button 208 having been actioned by the user. On receipt of this command signal, the processing sub-system 14 records the user feedback and, if necessary, performs such further actions as have been prescribed according to the chosen action. In this respect, the person skilled in the art should appreciate that the further actions may act as feedback for elements outside of the display system 200, such as the parsing server 2, and therefore are not relevant to the present invention.
  • the input sub-system 16 sends an appropriate command signal (not shown) back to the processing sub-system 14 indicative that the playback action button 212 has been pressed.
  • the processing sub-system 14 increments the current play speed of the activity clip 3 by one magnitude. However, if the current play speed of the activity clip 3 is at its highest magnitude, the processing sub-system 14 sets the current play speed to its lowest magnitude rather than increasing the current play speed by one magnitude. It is expected that the person skilled in the art would be knowledgeable of how to adjust the play speed of an activity clip 3 in line with a specified play speed setting.
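By way of illustration only, the wrap-around play-speed adjustment described in the preceding item might be implemented along the following lines. The specific ladder of speeds (PLAY_SPEEDS) is an assumption for the sketch; the specification does not fix the available magnitudes.
```python
# Assumed ladder of play speeds; the specification leaves the magnitudes open.
PLAY_SPEEDS = [1.0, 2.0, 4.0, 8.0]

def next_play_speed(current: float) -> float:
    """Increment the play speed by one magnitude, wrapping back to the lowest."""
    index = PLAY_SPEEDS.index(current)
    if index == len(PLAY_SPEEDS) - 1:   # already at the highest magnitude
        return PLAY_SPEEDS[0]           # wrap around to the lowest magnitude
    return PLAY_SPEEDS[index + 1]

assert next_play_speed(2.0) == 4.0
assert next_play_speed(8.0) == 1.0      # highest magnitude wraps to the lowest
```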
  • the input sub-system 16 sends an appropriate command signal (not shown) back to the processing sub-system 14 indicative that the save button 210a has been pressed.
  • the processing sub-system 14 operates to save the activity clip 3 currently being displayed within the display band 204 to the default save location.
  • the input sub-system 16 sends an appropriate command signal (not shown) back to the processing sub-system 14 indicative that the share button 210b has been pressed.
  • the processing sub-system 14 operates to send the activity clip 3 currently being displayed within the display band 204 to a default external device, such as the user’s mobile phone.
  • the input sub-system 16 sends an appropriate command signal (not shown) back to the processing sub-system 14 indicative that the watch later button 210c has been pressed. On receipt of this command signal, the processing sub-system 14 operates to add the activity clip 3 to a playlist entitled “Watch Later” which can then be actioned later by the user through the current system or through other playback systems (not shown).
  • If the user actions the watched button 210d, the input sub-system 16 sends an appropriate command signal (not shown) back to the processing sub-system 14 indicative that the watched button 210d has been pressed. On receipt of this command signal, the processing sub-system 14 operates to stop displaying the activity clip 3 currently being displayed within the display band 204.
  • the input sub-system 16 sends an appropriate command signal (not shown) back to the processing sub-system 14 indicative that the highlight button 210e has been pressed.
  • the processing sub-system 14 creates a highlight set (not shown) of activity clips 3 (if not already created). Once created, the processing sub-system 14 adds the activity clip 3 presently being displayed within the display band 204, and its associated metadata 24, to the highlight set. In this manner, the highlight set is practically a subset of the set of activity clips 3.
  • the additional display unit 214 is used to display the first frame of each of the upcoming activity clips 3 from the set of activity clips 3 to be displayed next.
  • the first frames of the next five (5) activity clips 3 are displayed via display window 216.
  • these five activity clips 3 are displayed vertically in the order in which they are to be displayed via the display windows 22, i.e. the top frame shown is the first frame of the activity clip 3 that will be displayed when the next of the currently displayed activity clips 3 completes.
  • the processing sub-system 14 operates to display each activity clip 3 in the highlight set via a display window 22.
  • the sequence in which the activity clips 3 in the highlight set are displayed is determined on a first-in, first-out basis by reference to the sequence in which the user actioned the highlight button 210e. This means that the highlight set may display activity clips 3 in a different order to the original order in which the same activity clips 3 were displayed from the set of activity clips 3.
  • the highlight set is saved as a playlist entitled “Highlights” (or added to such a playlist if it already exists) which can then be actioned later by the user through the current system or through other playback systems.
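The following sketch, offered only as an illustration, models the highlight set just described as a first-in, first-out subset of the activity clips; the class and method names are hypothetical and are not taken from the specification.
```python
class HighlightSet:
    """Illustrative first-in, first-out highlight set of activity clips."""

    def __init__(self):
        self._entries = []   # (clip, metadata) pairs in the order highlighted

    def add(self, clip, metadata):
        # Clips are kept in the order in which the user actioned the highlight
        # button, which may differ from their original display order.
        self._entries.append((clip, metadata))

    def as_playlist(self):
        # e.g. to be saved as (or appended to) a playlist entitled "Highlights"
        return [clip for clip, _ in self._entries]

highlights = HighlightSet()
highlights.add("activity clip 7", {"camera": 3})
highlights.add("activity clip 2", {"camera": 1})
print(highlights.as_playlist())   # ['activity clip 7', 'activity clip 2']
```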
  • In accordance with a third embodiment of the invention, where like numerals reference like parts, there is a video display system 300.
  • the video display system 300 is a reconfigured version of the second embodiment of the invention excluding the additional display unit 214.
  • each display unit 18 has an upper band 302, a display band 304 and a lower band 306.
  • the lower band 306 is used to display the set of clip action buttons 210 and the playback action button 212.
  • the activity clips 3 in the set of activity clips 3 are displayed in the same manner as described in the first embodiment in each of the display windows 22 contained within the display band 304.
  • An approval action button 308 is superimposed over the activity clip 3 in the top right corner of each display window 22.
  • the approval action button 308 presents as an outlined star (indicative of an unapproved state) or a solid star (indicative of an approved state).
  • the user may action the approval action button 308 in the same manner as actioning an approval action button 208a, 208b and the processing sub-system 14 acts in a similar manner to that described in the second embodiment to record the user’s feedback.
  • the processing sub-system 14 again acts in a similar manner to that described in the second embodiment for the appropriate action button 210, 212 actioned.
  • the effect of actioning a clip action button 210 or the playback action button 212 applies to each activity clip 3 displayed in the display windows 22 contained within the display band 304 rather than just a single activity clip 3.
  • the user may click on any of the display windows 22. On doing so, the playback of each activity clip 3 in the remaining display windows 22 is paused as the selected display window 22 is maximised to take up the whole display band 304.
  • the activity clip 3 displayed in the selected display window 22 continues to play, but now in its enlarged form, until such time as the activity clip 3 either finishes or the user again clicks on the now-enlarged display window 22.
  • the display window 22 then returns to its original size and position to either display the next activity clip 3 in the set of activity clips 3 or continue to play the existing activity clip 3 (as appropriate).
  • the activity clips 3 shown in the remaining display windows 22 then resume playback from their paused point.
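As a rough, non-authoritative sketch of the click-to-enlarge interaction just described, the small state machine below pauses the other windows while one window is enlarged and resumes them when the clip finishes or the user clicks the enlarged window again; all names are assumptions for the example.
```python
class DisplayBand:
    """Sketch of the click-to-enlarge behaviour for a band of display windows."""

    def __init__(self, n_windows: int):
        self.paused = [False] * n_windows
        self.enlarged = None            # index of the enlarged window, if any

    def on_window_clicked(self, index: int):
        if self.enlarged is None:
            # Pause every other window and enlarge the selected one.
            self.paused = [i != index for i in range(len(self.paused))]
            self.enlarged = index
        elif self.enlarged == index:
            # Clicking the enlarged window again restores the original layout.
            self._restore()

    def on_clip_finished(self, index: int):
        # The enlarged window's clip finishing also restores the layout.
        if self.enlarged == index:
            self._restore()

    def _restore(self):
        # Return the window to its original size; the remaining windows
        # resume playback from their paused point.
        self.enlarged = None
        self.paused = [False] * len(self.paused)

band = DisplayBand(n_windows=9)
band.on_window_clicked(4)    # enlarge window 5; pause the other eight
band.on_clip_finished(4)     # its clip ends; all windows resume
```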
  • the preferred configuration of the display windows 22 may vary from those described without departing from the scope of the invention.
  • the display windows may be arranged in a 4x3 or 6x2 matrix.
  • the amount of time that the static image is shown between display of activity clips 3 in a display window 22 may vary, but ideally, should be of at least sufficient duration so as to give an attentive user an understanding that the previous activity clip 3 has finished.
  • the static image may also be shown for a predetermined period of time before the first frame of each activity clip 3 is displayed to the user.
  • the static image is shown in each display window 22 rather than the first frame of each activity clip 3 until such time as the input sub-system 16 records a left mouse click as having occurred.
  • the static images are thereafter replaced with the relevant activity clips 3 that proceed to play straight away.
  • While the second embodiment describes the parsing server 2 as ordering the set of activity clips 3 according to time of recordal, other factors may be used as the basis for ordering the set of activity clips 3. For example: the duration of each activity clip 3; an assigned anomaly level of each activity clip 3; an assigned priority level of each activity clip 3; an assigned category of each activity clip 3; or the security camera 1 that recorded the activity clip 3.
  • Certain metadata 24 described in the embodiment as being superimposed over the video footage by the processing sub-system 14 may, in fact, be superimposed over the video footage by the security camera 1, or by the parsing server 2. Accordingly, it is not an essential element of the invention that the processing sub-system 14 superimpose or overlay metadata 24 on the activity clip 3.
  • a user may be able, using the input sub-system 16, to create a filtered set of activity clips 3 with the video display system 10, 200, 300 then operable to display activity clips 3 taken from the filtered set rather than from the full set of activity clips 3.
  • the user may be able to create a filtered set of activity clips 3 based on specified date/time ranges. In this manner, while the invention is intended to have most benefit when the set of activity clips 3 covers anomalous footage taken over a twenty-four (24) hour period, other time periods can be covered (such as a seventy-two (72) hour period over a closed weekend).
  • the user may be able to create a filtered set of activity clips 3 based on one or more metadata 24 value(s).
  • the metadata 24 of an activity clip 3 may indicate the location of the security camera 1, and the set of activity clips 3 may then be filtered to only include such activity clips 3 as have been recorded from security cameras designated as having an “outside” location.
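To make the filtering idea concrete, here is a minimal sketch (not the system's actual interface) that selects activity clips by a camera-location value and a date/time range; the metadata keys camera_location and recorded_at are invented for the example.
```python
from datetime import datetime

def filter_clips(clips, *, location=None, start=None, end=None):
    """Return the activity clips whose metadata matches the user-defined criteria.

    Each clip is represented here as a dict of metadata; the keys
    `camera_location` and `recorded_at` are illustrative only.
    """
    selected = []
    for clip in clips:
        if location is not None and clip.get("camera_location") != location:
            continue
        recorded = clip.get("recorded_at")
        if start is not None and recorded < start:
            continue
        if end is not None and recorded > end:
            continue
        selected.append(clip)
    return selected

# e.g. only clips from cameras designated "outside", over a 24 hour period
all_clips = [
    {"clip_id": 1, "camera_location": "outside", "recorded_at": datetime(2022, 6, 30, 9, 15)},
    {"clip_id": 2, "camera_location": "inside",  "recorded_at": datetime(2022, 6, 30, 9, 20)},
]
outside_clips = filter_clips(
    all_clips,
    location="outside",
    start=datetime(2022, 6, 30, 0, 0),
    end=datetime(2022, 7, 1, 0, 0),
)
```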
  • the processing sub-system 14 may perform analytics on each activity clip 3 in the set of activity clips 3.
  • the information generated by the analytical processes undertaken may form part of the metadata 24, or be used as the basis for additional reporting.
  • the processing sub-system 14 may also superimpose a textual description of the anomalous activity as shown in the activity clip 3 over the video footage. Ideally, but not necessarily, this textual description is displayed centrally at the bottom of each display window 22.
  • the textual description may be provided by the parsing server 2 or generated by the processing sub-system 14 following analysis of the activity clip 3.
  • a user may use the input sub-system 16 to adjust the playback quality, speed, or resolution, of the video being displayed in a display window 22.
  • a user may use the input sub-system 16 to adjust the playback quality, speed, or resolution, of a specific activity clip 3 being displayed.
  • the processing sub-system 14 may schedule the sequential display of each activity clip 3 in the set of activity clips 3 based on the duration of each clip, rather than waiting on the receipt of completion signals. Under this approach, the scheduling may also be set such that the total duration of the activity clips 3 displayed by each display window 22 is as close to one another as is possible to avoid one display window 22 inadvertently displaying the longest duration activity clips 3.
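One plausible (but purely illustrative) way to realise such duration-based scheduling is a greedy assignment that always gives the next-longest clip to the window with the smallest running total, as sketched below; the specification does not prescribe this particular algorithm, and all names are hypothetical.
```python
import heapq

def balance_schedule(clips, n_windows):
    """Greedy sketch: assign clips so per-window total durations stay close.

    `clips` is a list of (clip_id, duration_s) pairs. Longest clips are placed
    first, each into the window with the smallest running total duration.
    """
    heap = [(0.0, w, []) for w in range(n_windows)]   # (total, window, playlist)
    heapq.heapify(heap)
    for clip_id, duration in sorted(clips, key=lambda c: c[1], reverse=True):
        total, window, playlist = heapq.heappop(heap)
        playlist.append(clip_id)
        heapq.heappush(heap, (total + duration, window, playlist))
    return {window: (total, playlist) for total, window, playlist in heap}

# Six clips of varying length spread across three display windows.
schedule = balance_schedule([(i, d) for i, d in enumerate([90, 30, 60, 45, 120, 20])], 3)
print(schedule)   # per-window (total duration, ordered clip ids)
```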
  • the processing sub-system 14 may also display in each display window a slider timeline of the activity clip 3 being displayed.
  • the user can then use the input sub-system 16 to change the current frame of the activity clip 3 being displayed by adjusting the current position of the slider timeline.
  • the metadata 24 or textual description may be displayed within either the upper band 202 or the lower band 206.
  • the method by which activity clips 3 forming part of the highlight set are displayed to a user may be the same method by which the original activity clips 3 are displayed, i.e. the activity clips 3 of the highlight set may be displayed to the user using the invention as described herein.
  • the additional display 214 may be used to display metadata 24 associated with each of these activity clips 3.
  • the input sub-system 16 may take differing forms to that described above.
  • the input sub-system may incorporate a keyboard or a stylus in addition to, or in place of, mouse 26.
  • the input sub-system may operate off the basis of physical commands captured by an attached camera or voice commands captured by a microphone.
  • Although the parsing server 2, processing sub-system 14, input sub-system 16 and display units 18 have all been described as separate elements, it is possible for the functionality of each of these elements to be performed by a single processing system, such as a computer. Thus, it should be understood that one or more of the aforementioned elements may be integrated into a single apparatus without impacting on the invention.
  • a reference to a processing system such as a computer or server is a reference to all of the components that would naturally form part of that processing system in order to achieve its intended functions.
  • the whole system or parts thereof may be implemented through such devices as tablets or smart phones.
  • the watched button 210d may be used to allow the set of activity clips 3 to be watched repeatedly and/or in sessions. In this manner, repeated or resumed viewings of the set of activity clips 3 will not show any activity clip 3 that has already been marked as watched by way of the watched button 210d.
  • the additional display 214 may be omitted, with a preview of the upcoming activity clips 3 in the set of activity clips 3 instead being displayed to the user through an additional display window 22.
  • Each display window 22 may be provided with a coloured outline.
  • the colour of the coloured outline may be used to communicate some aspect of the metadata 24 to the user.
  • the colour of the outline may be used to communicate the category of anomaly detected by the parsing server 2.
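As a small hedged example, the colour coding mentioned above could be driven by a lookup from the anomaly category in the clip metadata; both the categories and the colours here are invented for illustration.
```python
# Assumed mapping from anomaly category to outline colour; the specification
# leaves both the categories and the colours open.
OUTLINE_COLOURS = {"intrusion": "red", "loitering": "orange", "crowding": "yellow"}

def outline_colour(metadata: dict, default: str = "grey") -> str:
    """Pick the coloured outline for a display window from the clip metadata."""
    return OUTLINE_COLOURS.get(metadata.get("anomaly_category"), default)

print(outline_colour({"anomaly_category": "loitering"}))   # orange
print(outline_colour({}))                                  # grey (no category)
```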
  • the upcoming activity clips 3 being displayed by way of the additional display 214 may be re-ordered by a user using the input sub-system 16. In this manner, if an upcoming activity clip 3 is deemed a priority viewing by the user, the user can override the order of the set of activity clips 3 decided by the video display system 200, 300.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Library & Information Science (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

A video display system (10) for displaying activity clips (3) in a set of activity clips, the video display system (10) comprising a processing sub-system (14) and a visual display sub-system (18). The visual display sub-system (18) is configured to display a matrix of at least two display windows (22). The processing sub-system (14) initially processes a number of activity clips (3) in the set of activity clips equal to the number of display windows (22) in the matrix such that each of these activity clips (3) is played in a display window (22) of the matrix simultaneous with each other activity clip (3) in the matrix. Subsequently, on completion of display of each activity clip (3) in a display window (22), the processing sub-system (14) operates to display in that same display window (22) the next activity clip (3) in the set of activity clips until all activity clips (3) in the set of activity clips have been displayed in a display window (22).

Description

“VISUAL DISPLAY SYSTEM”
FIELD OF THE INVENTION
[0001] The invention relates to a visual display system. The invention is particularly suited for displaying video footage taken from one or more security cameras.
BACKGROUND TO THE INVENTION
[0002] The following discussion of the background to the invention is intended to facilitate an understanding of the present invention. However, it should be appreciated that the discussion is not an acknowledgment or admission that any of the material referred to was published, known or part of the common general knowledge in any jurisdiction as at the priority date of the application.
[0003] Security cameras can be generally categorised according to their intended purpose. The first category of security cameras are those trained on a particular object, such as a door. The second category of security cameras are those trained on an area, such as a carpark. The intent of both categories of security cameras, however, remains the same - to capture video of any activity that falls within the security camera’s field of view.
[0004] The problem with security cameras (of either category) is that they often run for extended periods of time with little or no activity actually being recorded. Reviewing the recorded footage to get to any footage of interest is a tedious and unproductive task - particularly when the camera works on a twenty-four hour, seven-day-a-week basis. Considering that commercial and/or governmental premises may operate a suite of security cameras, the amount of time required to review security camera footage showing no change in activity can mean that one or more persons may need to be employed solely to perform this task.
[0005] One method of seeking to resolve this problem has been the introduction of security cameras that operate to record video footage of the object or area when motion has been detected in the camera’s field of view. This attempted solution means that there is no longer video footage of a static image that needs to be traversed before reaching footage of an activity of potential importance. However, the footage that is captured may be of incidental or routine activities of no relevance - the review of which is an unproductive use of resources, namely, the time of security personnel.
[0006] For this reason, even more recent solutions have sought to parse the security footage captured by each security camera with the intent of only showing to security personnel selected footage showing anomalous activity. This approach, while solving the original problem, presents a new problem of how to display such footage to the security personnel in a manner which allows them to quickly assess the relevance of each clip of selected footage.
[0007] One method of displaying these clips is the video synopsis approach such as that described in Australian Patent 2006314066 filed by Yissum Research Development Company of the Hebrew University of Jerusalem titled “Method and System for Producing a Video Synopsis”.
[0008] The video synopsis approach superimposes a prominent image from one clip over the prominent images of one or more clips, with the resulting images each being overlayed against a common static background image. Each prominent image is timestamped, allowing security personnel to place the clip in context. Clicking on the prominent image or timestamp will then cause the associated clip to be played in full.
[0009] While this approach has merit when a security camera has recorded only a few instances of anomalous activity, if there has been a large number of such instances, the resulting display can be overly crowded. Additionally, where there are a large number of prominent images to be overlayed for display to security personnel, the problem of how such images should be overlayed in a manner that clearly distinguishes each to the security personnel also arises.
[0010] It is also to be noted that as the video synopsis approach seeks to overlay images against a common static background image, this approach to displaying relevant footage is, in effect, tied to a security camera. Thus, security personnel must watch the clips of anomalous activity taken by each security camera on a sequential basis.
[0011] It is therefore an object of the present invention to provide an alternative means for displaying selected video footage taken by a set of security cameras.
SUMMARY OF THE INVENTION
[0012] Throughout this document, unless otherwise indicated to the contrary, the terms “comprising”, “consisting of”, and the like, are to be construed as non-exhaustive, or in other words, as meaning “including, but not limited to”.
[0013] In accordance with a first aspect of the present invention there is a video display system for displaying activity clips in a set of activity clips, the video display system comprising: a processing sub-system; and a visual display sub-system configured to display a matrix of at least two display windows, where the processing sub-system processes a number of activity clips in the set of activity clips equal to the number of display windows in the matrix such that each of these activity clips is played in a display window of the matrix simultaneous with each other activity clip in the matrix and where, on each subsequent completion of display of an activity clip in a display window, the processing sub-system operates to display in that same display window the next activity clip in the set of activity clips until all activity clips in the set of activity clips have been displayed in a display window.
[0014] The visual display sub-system may comprise one or more display units, each display unit operable to display a set of display windows. In a preferred arrangement, the visual display sub-system comprises a single display unit operable to display a set of nine display windows arranged as a 3x3 matrix.
[0015] The processing sub-system may operate to display a static image for a predetermined length of time in each display window before playing an activity clip. In this manner, in addition to being displayed in between activity clips, the static image may also be used as part of a general “splash screen” before even the initial activity clips are played.
[0016] Each activity clip in the set of activity clips may include metadata and the processing sub-system is operable to filter the activity clips in the set of activity clips based on user defined criteria relating to the metadata. This metadata may also be superimposed by the processing sub-system over the playback of the activity clip. Alternatively, where each display window is divided into bands, the processing sub-system may operate to display this metadata in a band of the display window separate from the band of the display window in which the activity clip is being played. The metadata concerned may include such elements as the security camera that recorded the activity clip; the location of the security camera that recorded the activity clip; the date and time when the activity clip was recorded; the length of the recorded activity clip (in seconds); a categorisation of the anomaly shown in the activity clip; a priority assessment of the anomaly shown in the activity clip; a textual description of the activity being shown in the activity clip.
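Purely as an illustration of the kind of metadata listed in the preceding paragraph, a clip's metadata could be modelled as follows; the field names are assumptions for the sketch rather than terms used by the system.
```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ClipMetadata:
    """Illustrative container for the metadata elements listed above."""
    camera_id: str          # the security camera that recorded the activity clip
    camera_location: str    # the location of that security camera
    recorded_at: datetime   # date and time when the activity clip was recorded
    length_s: int           # length of the recorded activity clip, in seconds
    anomaly_category: str   # categorisation of the anomaly shown in the clip
    anomaly_priority: int   # priority assessment of the anomaly shown in the clip
    description: str        # textual description of the activity being shown
```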
[0017] Again, where each display window is divided into bands, the processing subsystem may operate to display one or more action buttons in a band of the display window separate from the band of the display window in which the activity clip is being played. Alternatively, the one or more action buttons may be superimposed over the playback of the activity clip.
[0018] In an alternative arrangement, the visual display sub-system is divided into bands, the matrix of display windows being displayed in one band and one or more action buttons being displayed in another band.
[0019] The action buttons displayed may be one or more of the following: a slider timeline; an approval action button; a playback action button; a save action button; a watch later action button; a watched action button; a highlights action button.
[0020] The video display system may further comprise an input sub-system. When a user selects a display window using the input sub-system, the processing sub-system operates to cease displaying the activity clip in the remaining display windows and use the visual display sub-system to solely display the activity clip being displayed in the selected display window and, to recommence displaying the activity clips in the remaining display windows when the activity clip shown in the selected display window has completed or the user again selects the enlarged display window. On selection of a display window in this manner, the display window may be enlarged to provide a greater viewing area and/or higher resolution view of the activity clip being displayed therein.
[0021] The video display system may also include an additional display window. The additional display window operable to show a summary image and/or metadata of at least the next activity clip in the set of activity clips to be displayed.
[0022] In accordance with a second aspect of the invention there is a method of displaying activity clips from a set of activity clips, the method comprising the steps of: processing an initial number of activity clips in the set of activity clips equal to a number of display windows in a matrix; simultaneously playing these initial activity clips in separate display windows; on completing playback of each activity clip, playing the next activity clip in the set of activity clips in the same display window as the now completed activity clip until each activity clip in the set of activity clips is or has been displayed in a display window.
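The following is a minimal sketch, with hypothetical names (ActivityClip, MatrixScheduler), of the scheduling steps described in this aspect; it is not the claimed implementation, merely one way the fill-then-replace behaviour could be expressed.
```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class ActivityClip:
    clip_id: int
    duration_s: float
    metadata: dict = field(default_factory=dict)

class MatrixScheduler:
    """Round-robin assignment of activity clips to a matrix of display windows."""

    def __init__(self, clips, rows, cols):
        self.queue = deque(clips)                     # ordered set of activity clips
        self.windows = [[None] * cols for _ in range(rows)]

    def start(self):
        # Step 1: fill every display window with the next clip in the set.
        for r, row in enumerate(self.windows):
            for c in range(len(row)):
                if self.queue:
                    self.windows[r][c] = self.queue.popleft()
        # Step 2: in a real system, playback of all filled windows would now
        # begin simultaneously (synchronised playback).

    def on_clip_complete(self, row, col):
        # Step 3: when a window reports completion, give it the next clip
        # from the set, or clear it once the set is exhausted.
        self.windows[row][col] = self.queue.popleft() if self.queue else None
        return self.windows[row][col]

# Usage: nine windows arranged as a 3x3 matrix, as in the preferred arrangement.
clips = [ActivityClip(i, 30.0 + i) for i in range(20)]
sched = MatrixScheduler(clips, rows=3, cols=3)
sched.start()
next_clip = sched.on_clip_complete(0, 0)   # window (0, 0) finished its clip
```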
[0023] The method may also include the step of displaying a static image for a predetermined length of time in each display window before playing an activity clip.
[0024] Each activity clip in the set of activity clips may include metadata. In such an arrangement, the method may also include the step of filtering the activity clips in the set of activity clips based on user defined criteria relating to the metadata. Alternatively, or cumulatively, the method may also include the step of superimposing one or more elements of the metadata over the playback of the activity clip. In a further variation, the method may include the steps of: dividing each display window into bands; and displaying one or more elements of the metadata in a band of the display window separate from the band of the display window in which the activity clip is being played.
[0025] The method may also include the steps of: dividing each display window into bands; and displaying one or more action buttons in a band of the display window separate from the band of the display window in which the activity clip is being played.
[0026] The method can also include the step of superimposing one or more action buttons over the playback of the activity clip.
[0027] In yet a further arrangement, the method may include the steps of: selecting a display window using an input sub-system; ceasing to display activity clips in each display window other than the selected display window; displaying the activity clip playing in the selected display window in an enlarged window that obscures at least one other display window; reducing the selected display window back to its original size and position on completion of display of the activity clip or an appropriate user action; and recommencing playback of the activity clips shown in each display window other than the selected display window.
[0028] The method may also include the step of displaying a summary image and/or metadata of at least the next activity clip in the set of activity clips to be displayed in an additional display window.
BRIEF DESCRIPTION OF THE DRAWINGS
[0029] The invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic of a video display system according to a first embodiment of the invention.
Figure 2 is a schematic of a video display system according to a second embodiment of the invention.
Figure 3 is a schematic of a video display system according to a third embodiment of the invention.
PREFERRED EMBODIMENTS OF THE INVENTION
[0030] In accordance with a first embodiment of the invention there is a visual display system 10. The visual display system 10 of this embodiment is designed to display selected video footage 12 taken by a set of security cameras 1.
[0031] Each security camera 1 in the set of security cameras 1 is in data communication with a parsing server 2. The parsing server 2 parses the raw video footage taken by each security camera 1 to create a set of activity clips 3 (i.e. the relevant video footage 12). Each activity clip 3 represents recorded video footage taken by a security camera 1 which the parsing server 2 has determined as anomalous according to its own internal processing rules.
[0032] It is to be appreciated by the person skilled in the art that the actual hardware that forms the parsing server 2 is immaterial to the present invention. Similarly, the internal processing rules used to determine what parts of the raw data received from a security camera 1 are anomalous and, thus, should form an activity clip 3 to be included in the set of activity clips 3, are also immaterial to the present invention. In this manner, the parsing server 2 should only be viewed as an external entity that provides the video footage that is input for the present invention.
[0033] The visual display system 10 comprises a processing sub-system 14, an input sub-system 16 and at least one display unit 18. The processing sub-system 14 is in data and control communication with the input sub-system 16 and each display unit 18. The processing sub-system 14 is also in data communication with the parsing server 2.
[0034] Each display unit 18 has a usable display area 20. In this embodiment, the usable display areas 20 of each display unit 18 are arranged so as to be in close proximity to one another and face the working area 4 of security personnel (not shown).
[0035] The processing sub-system 14 divides the usable display areas 20 of each display unit 18 into at least two display windows 22. In this first embodiment, there are three (3) display units 18a, 18b, 18c arranged in a curve relative to the working area 4 as shown in Figure 1. The processing sub-system 14 divides the usable display area 20 of each display unit 18a, 18b, 18c into three (3) display windows 22. The display windows 22 are arranged parallel to one another, such that each display unit 18 has a left display window 22a, a central display window 22b and a right display window 22c.
[0036] This embodiment of the invention will now be described in the context of its intended use.
[0037] The parsing server 2 receives raw video footage from the security cameras 1 and processes this footage as programmed to produce a set of activity clips 3. The set of activity clips 3 is then sent to the visual display system 10, where it is received by the processing sub-system 14.
[0038] It is to be noted here that each activity clip 3 in the set of activity clips 3 includes metadata 24 in addition to the relevant video footage. This metadata 24 may include such information as:
• the security camera 1 that recorded the video footage;
• the date and time when the video footage was recorded;
• the length of the recorded video footage (in seconds);
• a categorisation of the anomaly as detected by the parsing server 2; and
• a priority assessment of the anomaly as detected by the parsing server 2.
[0039] The processing sub-system 14 processes each activity clip 3 in the set of activity clips 3 in the order set by the parsing server 2. The first frame of the first activity clip 3 in the set of activity clips 3 is displayed in left display window 22a of display unit 18a. Processing continues in a similar manner to fill up each display window 22a, 22b, 22c of each display unit 18a, 18b, 18c. To elaborate:
• The first frame of the second activity clip 3 in the set of activity clips 3 is displayed in central display window 22b of display unit 18a;
• The first frame of the third activity clip 3 in the set of activity clips 3 is displayed in right display window 22c of display unit 18a;
• The first frame of the fourth activity clip 3 in the set of activity clips 3 is displayed in left display window 22a of display unit 18b;
• The first frame of the fifth activity clip 3 in the set of activity clips 3 is displayed in central display window 22b of display unit 18b;
• The first frame of the sixth activity clip 3 in the set of activity clips 3 is displayed in right display window 22c of display unit 18b;
• The first frame of the seventh activity clip 3 in the set of activity clips 3 is displayed in left display window 22a of display unit 18c;
• The first frame of the eighth activity clip 3 in the set of activity clips 3 is displayed in central display window 22b of display unit 18c;
• The first frame of the ninth activity clip 3 in the set of activity clips 3 is displayed in right display window 22c of display unit 18c.
[0040] In addition to displaying the first frame of the relevant activity clip 3 in its appropriate display window 22, the processing sub-system 14 also operates to display metadata 24 in the upper right corner of each display window 22. In this embodiment, the metadata includes the security camera 1 that recorded the footage and the date and time that the footage was recorded.
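The distribution described in paragraph [0039] amounts to filling the display windows 22 in row-major order, one activity clip 3 per cell. The sketch below illustrates that reading; the unit/window labels and the returned mapping are purely illustrative, and the rendering of first frames and metadata overlays is left to whatever display calls the processing sub-system 14 actually uses.

# Sketch only: assign the first clips in the set of activity clips 3 to the
# nine display windows 22 (three display units 18a-18c, each divided into
# windows 22a-22c), in the order set by the parsing server 2.
UNITS = ["18a", "18b", "18c"]
WINDOWS = ["22a", "22b", "22c"]  # left, central, right

def initial_assignment(activity_clips):
    """Return a mapping of (display unit, display window) -> activity clip."""
    cells = [(unit, window) for unit in UNITS for window in WINDOWS]
    # zip stops when either the cells or the clips run out, so fewer than
    # nine clips simply leaves some display windows unfilled.
    return dict(zip(cells, activity_clips))

# Example: the first clip lands in (18a, 22a), the fourth in (18b, 22a),
# and the ninth in (18c, 22c), matching the order listed above.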
[0041] Once the first frame of each of these first nine activity clips 3 is displayed via the display units 18a, 18b, 18c and the associated metadata 24 overlaid, the processing sub-system 14 awaits user input by way of the input sub-system 16. In this embodiment, the input sub-system includes a mouse 26 and the awaited user input is a left mouse click.
[0042] When the input sub-system 16 records a left mouse click as having occurred, the input sub-system 16 sends a command signal back to the processing sub-system 14 to initiate display of each video. On receipt of the command signal, the processing sub-system 14 operates to simultaneously commence playing of each of the displayed activity clips 3. In this manner, each displayed activity clip 3 is synchronised with each other displayed activity clip 3.
[0043] It is to be noted that the synchronisation referred to here is synchronisation of playback (i.e. the first second of each activity clip 3 is played at the same time, the second second of each activity clip 3 is played at the same time, etc.). In this manner, the synchronised playback of each activity clip 3 is independent of the date and time that it was recorded.
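Paragraphs [0042] and [0043] describe synchronisation of playback position rather than of recording time. The sketch below makes that distinction concrete: every displayed activity clip 3 is advanced by the same playback offset on each tick, regardless of when the footage was recorded. The player interface (finished, advance) is a hypothetical stand-in for the actual playback machinery, and it is assumed each player eventually reports itself finished.

# Sketch only: tick-based synchronised playback of the displayed clips.
def play_synchronised(players, tick_seconds=1.0):
    """Advance every unfinished player by the same playback offset per tick."""
    elapsed = 0.0
    while any(not player.finished for player in players):
        for player in players:
            if not player.finished:
                player.advance(tick_seconds)  # playback position, not recording time
        elapsed += tick_seconds
        # 'elapsed' is also the value that can be overlaid on each display
        # window as the amount of time the footage has been playing (see [0044]).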
[0044] For this embodiment, the processing sub-system 14 also operates to overlay the video footage being displayed in each display unit 18a, 18b, 18c with the amount of time that the video footage has been playing.
[0045] When an activity clip 3 has finished being displayed, the display unit 18 on which that activity clip 3 has been displayed sends a completion signal back to the processing sub-system 14. The completion signal provides details of the display unit 18 and the display window 22 where the now-finished activity clip 3 was displayed. Furthermore, to indicate to a user that an activity clip 3 has completed, the display unit 18 operates to display a static image (not shown) in the relevant display window 22 until either a new activity clip 3 is displayed in the display window 22 or all activity clips 3 in the set of activity clips 3 have been displayed in full.
[0046] In response to this completion signal, the processing sub-system 14 operates to display the next activity clip 3 in the set of activity clips 3 in the now vacated display unit 18/display window 22 combination. As part of this display, the processing sub-system 14 also operates to overlay the associated metadata 24 for that activity clip as already described.
[0047] In this manner, it is to be appreciated that each display unit 18/display window 22 can be considered an individual cell in a matrix. After the initial set of activity clips 3 has been distributed to each cell, the next activity clip 3 in the set of activity clips 3 is delivered to the now vacant cell for display.
[0048] The above process repeats until such time as the last activity clip 3 in the set of activity clips 3 has been displayed in full.
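The behaviour of paragraphs [0045] to [0048] can be summarised as a simple work queue: each completion signal vacates a cell, and the next activity clip 3 in the set is dispatched to that cell until the set is exhausted. The sketch below assumes hypothetical display_in_cell and show_static callables standing in for the actual display unit 18 operations.

# Sketch only: dispatch clips to display unit/display window cells and refill
# a cell each time a completion signal is received.
from collections import deque

def run_matrix(activity_clips, cells, display_in_cell, show_static):
    """Return a completion-signal handler bound to the remaining clips."""
    pending = deque(activity_clips)

    # Initial fill: one activity clip 3 per cell, in the order of the set.
    for cell in cells:
        if pending:
            display_in_cell(cell, pending.popleft())

    def on_completion(cell):
        # The completion signal identifies the display unit 18 and display
        # window 22 where the now-finished activity clip 3 was shown.
        show_static(cell)  # static image indicates to the user that the clip finished
        if pending:
            display_in_cell(cell, pending.popleft())  # next clip in the set
        # If nothing is pending, the static image simply remains until the
        # whole set of activity clips 3 has been displayed in full.

    return on_completion

In practice, display_in_cell would also overlay the associated metadata 24, as described in paragraph [0046].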
[0049] In accordance with a second embodiment of the invention, where like numerals reference like parts, there is a video display system 200. In this embodiment, the video display system 200 includes the following modifications.
[0050] Each display window 22a, 22b, 22c is divided into an upper band 202, a display band 204 and a lower band 206. The activity clip 3 to be displayed in each display window 22a, 22b, 22c is shown in the display band 204.
[0051] The upper band 202 is used to display a set of approval action buttons 208. The set of approval action buttons 208 comprises a positive approval action button 208a and a negative approval action button 208b. Each of the approval action buttons 208 uses an appropriate image to indicate its intended function.
[0052] The lower band 206 is used to display a set of clip action buttons 210 and a playback action button 212.
[0053] The set of clip action buttons 210 comprises a save action button 210a, a share action button 210b, a watch later action button 210c, a watched action button 210d and a highlight action button 210e. Each of the clip action buttons 210, and the playback action button 212, uses an appropriate image to indicate its intended function.
[0054] In this embodiment, the video display system 200 includes an additional display unit 214. This additional display unit 214 only has a single display window 216.
[0055] This second embodiment of the invention will now be described in the context of its intended use.
[0056] The activity clips 3 in the set of activity clips 3 are displayed in the same manner as described in the first embodiment. However, while being displayed, a user may use the input sub-system 16 to action one or more of the action buttons 208, 210, 212 located in either the upper band 202 or the lower band 206 of the relevant display window 22.
[0057] If the user actions one of the approval action buttons 208a, 208b, the input sub-system 16 sends an appropriate command signal (not shown) back to the processing sub-system 14 indicative of the approval action button 208 having been actioned by the user. On receipt of this command signal, the processing sub-system 14 records the user feedback and, if necessary, performs such further actions as have been prescribed according to the chosen action. In this respect, the person skilled in the art should appreciate that the further actions may act as feedback for elements outside of the display system 200, such as the parsing server 2, and therefore are not relevant to the present invention.
[0058] If the user actions the playback action button 212, the input sub-system 16 sends an appropriate command signal (not shown) back to the processing sub-system 14 indicative that the playback action button 212 has been pressed. On receipt of this command signal, the processing sub-system 14 increments the current play speed of the activity clip 3 by one magnitude. However, if the current play speed of the activity clip 3 is already at its highest magnitude, the processing sub-system 14 sets the current play speed to its lowest magnitude rather than increasing the current play speed by one magnitude. It is expected that the person skilled in the art would be knowledgeable of how to adjust the play speed of an activity clip 3 in line with a specified play speed setting.
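The playback action button 212 behaviour just described is a cycle through a fixed set of play-speed magnitudes, wrapping from the highest back to the lowest. The particular magnitudes below are assumed for illustration only; the embodiment does not specify them.

# Sketch only: increment the current play speed by one magnitude, wrapping
# from the highest magnitude back to the lowest.
PLAY_SPEEDS = [0.5, 1.0, 2.0, 4.0]  # assumed magnitudes, lowest to highest

def next_play_speed(current_speed):
    index = PLAY_SPEEDS.index(current_speed)
    if index == len(PLAY_SPEEDS) - 1:
        return PLAY_SPEEDS[0]       # already at the highest magnitude: wrap to the lowest
    return PLAY_SPEEDS[index + 1]   # otherwise increment by one magnitude

# e.g. next_play_speed(1.0) returns 2.0, and next_play_speed(4.0) returns 0.5.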
[0059] If the user actions the save action button 210a, the input sub-system 16 sends an appropriate command signal (not shown) back to the processing sub-system 14 indicative that the save action button 210a has been pressed. On receipt of this command signal, the processing sub-system 14 operates to save the activity clip 3 currently being displayed within the display band 204 to the default save location.
[0060] If the user actions the share action button 210b, the input sub-system 16 sends an appropriate command signal (not shown) back to the processing sub-system 14 indicative that the share action button 210b has been pressed. On receipt of this command signal, the processing sub-system 14 operates to send the activity clip 3 currently being displayed within the display band 204 to a default external device, such as the user’s mobile phone.
[0061] If the user actions the watch later button 210c, the input sub-system 16 sends an appropriate command signal (not shown) back to the processing sub-system 14 indicative that the watch later button 210c has been pressed. On receipt of this command signal, the processing sub-system 14 operates to add the activity clip 3 to a playlist entitled “Watch Later” which can then be actioned later by the user through the current system or through other playback systems (not shown).
[0062] If the user actions the watched button 210d, the input sub-system 16 sends an appropriate command signal (not shown) back to the processing sub-system 14 indicative that the watched button 210d has been pressed. On receipt of this command signal, the processing sub-system 14 operates to stop displaying the activity clip 3 currently being displayed within the display band 204.
[0063] If the user actions the highlight button 210e, the input sub-system 16 sends an appropriate command signal (not shown) back to the processing sub-system 14 indicative that the highlight button 210e has been pressed. On receipt of this command signal, the processing sub-system 14 creates a highlight set (not shown) of activity clips 3 (if not already created). Once created, the processing sub-system 14 adds the activity clip 3 presently being displayed within the display band 204, and its associated metadata 24, to the highlight set. In this manner, the highlight set is effectively a subset of the set of activity clips 3.
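As paragraph [0063] (and paragraph [0065] below) describes it, the highlight set is an ordered subset of the set of activity clips 3, built in the sequence in which the user actioned the highlight button 210e and later replayed on a first-in first-out basis. A minimal sketch of that structure, with hypothetical method names:

# Sketch only: the highlight set preserves the order in which the highlight
# button 210e was actioned, which may differ from the original display order.
class HighlightSet:
    def __init__(self):
        self._entries = []  # created lazily the first time the button is actioned

    def add(self, activity_clip, metadata):
        # store the clip together with its associated metadata 24
        self._entries.append((activity_clip, metadata))

    def playback_order(self):
        return list(self._entries)  # first-in first-out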
[0064] While each of the display units 18 is showing activity clips 3 in its display windows 22, the additional display unit 214 is used to display the first frame of the upcoming activity clips 3 from the set of activity clips 3 that are to be displayed next. In this embodiment, the first frames of the next five (5) activity clips 3 are displayed via display window 216. Furthermore, these five activity clips 3 are displayed vertically in the order in which they are to be displayed via the display windows 22, i.e. the first frame of the next activity clip 3 to be displayed (once any currently displayed activity clip 3 completes) is shown at the top.
[0065] Once all activity clips 3 from the set of activity clips 3 have been displayed in full, if the user has actioned the highlight button 210e at any time, the user is asked whether they wish to view the activity clips 3 that form the highlight set. If they indicate that they do wish to view the highlight set of activity clips 3 at that time, the processing sub-system 14 operates to display each activity clip 3 in the highlight set via a display window 22. The activity clips 3 in the highlight set are displayed on a first-in first-out basis, as determined by the sequence in which the user actioned the highlight button 210e. This means that the highlight set may display activity clips 3 in a different order to the original order of display of the same activity clips 3 as taken from the set of activity clips 3.
[0066] If the user decides not to view the activity clips 3 that form the highlight set on completion of the display of all activity clips 3 in the set of activity clips 3, the highlight set is saved as a playlist entitled “Highlights” (or added to such a playlist if it already exists) which can then be actioned later by the user through the current system or through other playback systems.
[0067] In accordance with a third embodiment of the invention, where like numerals reference like parts, there is a video display system 300. The video display system 300 is a reconfigured version of the second embodiment of the invention, excluding the additional display unit 214.
[0068] In this embodiment each display unit 18 has an upper band 302, a display band 304 and a lower band 306. Multiple display windows 22 are shown within the display band 304.
[0069] The lower band 306 is used to display the set of clip action buttons 210 and the playback action button 212.
[0070] This third embodiment of the invention will now be described in the context of its intended use.
[0071] The activity clips 3 in the set of activity clips 3 are displayed in the same manner as described in the first embodiment in each of the display windows 22 contained within the display band 304. An approval action button 308 is superimposed over the activity clip 3 in the top right corner of each display window 22. In this embodiment, the approval action button 308 presents as an outlined star when indicating an unapproved state and as a solid star when indicating an approved state.
[0072] The user may action the approval action button 308 in the same manner as actioning an approval action button 208a, 208b and the processing sub-system 14 acts in a similar manner to that described in the second embodiment to record the user’s feedback.
[0073] Similarly, when the user actions any of the clip action buttons 210 or the playback action button 212, the processing sub-system 14 again acts in a similar manner to that described in the second embodiment for the appropriate action button 210, 212 actioned. However, in this embodiment, the effect of actioning a clip action button 210 or the playback action button 212 applies to each activity clip 3 displayed in the display windows 22 contained within the display band 304 rather than just a single activity clip 3.
[0074] During display of the activity clips 3, the user may click on any of the display windows 22. On doing so, the playback of each activity clip 3 in the remaining display windows 22 is paused as the selected display window 22 is maximised to take up the whole display band 304. The activity clip 3 displayed in the selected display window 22 continues to play, but now in its enlarged form, until such time as the activity clip 3 either finishes or the user again clicks on the now-enlarged display window 22.
[0075] When the activity clip 3 finishes, or the user again clicks on the now-enlarged display window 22, the display window 22 returns to its original size and position to either display the next activity clip 3 in the set of activity clips 3 or continue to play the existing activity clip 3 (as appropriate). The activity clips 3 shown in the remaining display windows 22 then resume playback from their paused point.
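The click-to-enlarge behaviour of paragraphs [0074] and [0075] can be read as a small toggle: selecting a window pauses every other clip and maximises the selected one; completion of that clip, or a second click, restores the layout and resumes the paused clips from their paused point. The pause/resume/maximise/restore calls below are hypothetical stand-ins for the actual display operations.

# Sketch only: toggle between the normal matrix layout and a single
# enlarged display window 22 within the display band 304.
def on_window_clicked(selected, all_windows, enlarged):
    """Return the newly enlarged window, or None when the matrix is restored."""
    if enlarged is None:
        for window in all_windows:
            if window is not selected:
                window.pause()       # remaining clips pause at their current frame
        selected.maximise()          # selected clip keeps playing, enlarged
        return selected
    # A second click (or clip completion) restores the original layout.
    enlarged.restore()
    for window in all_windows:
        if window is not enlarged:
            window.resume()          # paused clips resume from their paused point
    return None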
[0076] It should be appreciated by the person skilled in the art that the above invention is not limited to the embodiments described. In particular, the following modifications and improvements may be made without departing from the scope of the present invention:
• [0077] While the invention has been described in the context of three (3) display units 18 each split into three display windows 22, there is no reason why the invention could not use any number of display units 18 with any number of display windows 22 (excepting a single display unit 18 showing a single display window 22). For instance, in its preferred configuration, a single display unit 18 is divided into nine (9) display windows 22 arranged in a 3x3 matrix.
• [0078] Similarly, the preferred configuration of the display windows 22 may vary from those described without departing from the scope of the invention. For instance, the display windows may be arranged in a 4x3 or 6x2 matrix.
• [0079] The amount of time that the static image is shown between display of activity clips 3 in a display window 22 may vary, but ideally, should be of at least sufficient duration so as to give an attentive user an understanding that the previous activity clip 3 has finished.
• [0080] The static image may also be shown for a predetermined period of time before the first frame of each activity clip 3 is displayed to the user. In a further alternative arrangement, the static image is initially shown in each display window 22, rather than the first frame of each activity clip 3, until such time as the input sub-system 16 records a left mouse click as having occurred. When a left mouse click occurs in this arrangement, the static images are thereafter replaced with the relevant activity clips 3, which proceed to play straight away.
• [0081] While the second embodiment describes the parsing server 2 as ordering the set of activity clips 3 according to time of recordal, other factors may be used as the basis for ordering the set of activity clips 3. For example: the duration of each activity clip 3; an assigned anomaly level of each activity clip 3; an assigned priority level of each activity clip 3; an assigned category of each activity clip 3; or the security camera 1 that recorded the activity clip 3.
• [0082] Certain metadata 24 described in the embodiment as being superimposed over the video footage by the processing sub-system 14 may, in fact, be superimposed over the video footage by the security camera 1, or by the parsing server 2. Accordingly, it is not an essential element of the invention that the processing sub-system 14 superimpose or overlay metadata 24 on the activity clip 3.
• [0083] A user may be able, using the input sub-system 16, to create a filtered set of activity clips 3, with the video display system 10, 200, 300 then operable to display activity clips 3 taken from the filtered set rather than from the full set of activity clips 3. For instance, the user may be able to create a filtered set of activity clips 3 based on specified date/time ranges. In this manner, while the invention is intended to have most benefit when the set of activity clips 3 covers anomalous footage taken over a twenty-four (24) hour period, other time periods can be covered (such as a seventy-two (72) hour period over a closed weekend).
• [0084] In a similar manner, the user may be able to create a filtered set of activity clips 3 based on one or more metadata 24 value(s). For example, the metadata 24 of an activity clip 3 may indicate the location of the security camera 1, and the set of activity clips 3 may then be filtered to only include such activity clips 3 as have been recorded from security cameras designated as having an “outside” location.
• [0085] The processing sub-system 14 may perform analytics on each activity clip 3 in the set of activity clips 3. The information generated by the analytical processes undertaken may form part of the metadata 24, or be used as the basis for additional reporting.
• [0086] The processing sub-system 14 may also superimpose a textual description of the anomalous activity as shown in the activity clip 3 over the video footage. Ideally, but not necessarily, this textual description is displayed centrally at the bottom of each display window 22. The textual description may be provided by the parsing server 2 or generated by the processing sub-system 14 following analysis of the activity clip 3.
• [0087] A user may use the input sub-system 16 to adjust the playback quality, speed, or resolution, of the video being displayed in a display window 22. Alternatively, a user may use the input sub-system 16 to adjust the playback quality, speed, or resolution, of a specific activity clip 3 being displayed.
• [0088] The processing sub-system 14 may schedule the sequential display of each activity clip 3 in the set of activity clips 3 based on the duration of each clip, rather than waiting on the receipt of completion signals. Under this approach, the scheduling may also be set such that the total duration of the activity clips 3 displayed by each display window 22 is as close as possible to that of every other display window 22, so as to avoid one display window 22 inadvertently being allocated all of the longest-duration activity clips 3 (an illustrative scheduling sketch is provided after this list).
• [0089] The processing sub-system 14 may also display in each display window 22 a slider timeline of the activity clip 3 being displayed. The user can then use the input sub-system 16 to change the current frame of the activity clip 3 being displayed by adjusting the current position of the slider timeline.
• [0090] While elements such as the metadata 24 and the textual description are most likely to be superimposed over the activity clip 3, this is not required. For instance, in a variation of the second embodiment, the metadata 24 or textual description may be displayed within either the upper band 202 or the lower band 206.
• [0091] In a variation of the second embodiment, the method by which activity clips 3 forming part of the highlight set are displayed to a user may be the same method by which the original activity clips 3 are so displayed, i.e. the activity clips 3 of the highlight set may be displayed to the user using the invention as described herein.
• [0092] In addition to showing the first frame of each activity clip 3 forming part of the set of activity clips 3, the additional display unit 214 may be used to display metadata 24 associated with each of these activity clips 3.
• [0093] The input sub-system 16 may take differing forms to that described above. For instance, the input sub-system may incorporate a keyboard or a stylus in addition to, or in place of, mouse 26. Alternatively, the input sub-system may operate on the basis of physical commands captured by an attached camera or voice commands captured by a microphone.
• [0094] While the parsing server 2, processing sub-system 14, input sub-system 16 and display units 18 have all been described as separate elements, it is possible for the functionality of each of these elements to be performed by a single processing system, such as a computer. Thus, it should be understood that one or more of the aforementioned elements may be integrated into a single apparatus without impacting on the invention.
• [0095] It should be further appreciated that a reference to a processing system, such as a computer or server, is a reference to all of the components that would naturally form part of that processing system in order to achieve its intended functions. This includes storage means (such as hard disk drives), internal memory, display units (e.g. monitors) and other peripheral units as may be required (e.g. mouse and keyboard). In this manner, the whole system (or parts thereof) may be implemented through such devices as tablets or smart phones.
• [0096] In an alternative arrangement, the watched button 210d may be used to allow the set of activity clips 3 to be watched repeatedly and/or watched in sessions. In this manner, repeated or resumed viewings of the set of activity clips 3 will not show any activity clip 3 that has already been marked as watched by way of the watched button 210d.
• [0097] The additional display unit 214 may be omitted, with the preview of upcoming activity clips 3 in the set of activity clips 3 instead being displayed to the user through an additional display window 22.
• [0098] Each display window 22 may be provided with a coloured outline. The colour of the coloured outline may be used to communicate some aspect of the metadata 24 to the user. For example, the colour of the outline may be used to communicate the category of anomaly detected by the parsing server 2.
• [0099] The upcoming activity clips 3 being displayed by way of the additional display unit 214 may be re-ordered by a user using the input sub-system 16. In this manner, if an upcoming activity clip 3 is deemed a priority viewing by the user, the user can override the order of the set of activity clips 3 decided by the video display system 200, 300.
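Paragraph [0088] above suggests scheduling by clip duration so that the total playing time of each display window 22 is roughly equal. One common way to approximate this is a greedy longest-first assignment; the sketch below is an illustrative reading of that paragraph under that assumption, not a prescribed algorithm, and duration_of is a hypothetical accessor (for example reading the length field of the metadata 24).

# Sketch only: greedy longest-processing-time balancing of activity clips 3
# across display windows 22 so no window is left with all the longest clips.
import heapq

def balance_by_duration(activity_clips, num_windows, duration_of):
    """Return a list of clip lists, one per display window 22."""
    heap = [(0.0, index) for index in range(num_windows)]  # (total seconds, window)
    heapq.heapify(heap)
    schedule = [[] for _ in range(num_windows)]
    for clip in sorted(activity_clips, key=duration_of, reverse=True):
        total, index = heapq.heappop(heap)       # window with the least scheduled time
        schedule[index].append(clip)
        heapq.heappush(heap, (total + duration_of(clip), index))
    return schedule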
[0100] It should be further appreciated that even more embodiments of the invention incorporating one or more of the aforementioned features, where such features are not mutually exclusive, can be created without departing from the invention’s scope.

Claims

We Claim:
1. A video display system for displaying activity clips in a set of activity clips, the video display system comprising: a processing sub-system; and a visual display sub-system configured to display a matrix of at least two display windows, where the processing sub-system processes a number of activity clips in the set of activity clips equal to the number of display windows in the matrix such that each of these activity clips is played in a display window of the matrix simultaneous with each other activity clip in the matrix and where, on each subsequent completion of display of an activity clip in a display window, the processing sub-system operates to display in the same display window the next activity clip in the set of activity clips until all activity clips in the set of activity clips have been displayed in a display window.
2. A video display system according to claim 1, where the visual display sub-system comprises one or more display units, each display unit operable to display a set of display windows.
3. A video display system according to claim 2, where the visual display sub-system comprises a single display unit operable to display a set of nine display windows arranged as a 3x3 matrix.
4. A video display system according to any preceding claim, where the processing subsystem operates to display a static image for a predetermined length of time in each display window before playing an activity clip.
5. A video display system according to any preceding claim, where each activity clip in the set of activity clips includes metadata and the processing sub-system is operable to filter the activity clips in the set of activity clips based on user defined criteria relating to the metadata.
6. A video display system according to any preceding claim, where each activity clip in the set of activity clips includes metadata and the processing sub-system is operable to superimpose one or more elements of the metadata over the playback of the activity clip.
7. A video display system according to any preceding claim, where each activity clip in the set of activity clips includes metadata and each display window is divided into bands, the processing sub-system operable to display one or more elements of the metadata in a band of the display window separate from the band of the display window in which the activity clip is being played.
8. A video display system according to any one of claims 5 to 7, where the metadata includes any of the following: the security camera that recorded the activity clip; the location of the security camera that recorded the activity clip; the date and time when the activity clip was recorded; the length of the recorded activity clip (in seconds); a categorisation of the anomaly shown in the activity clip; a priority assessment of the anomaly shown in the activity clip; a textual description of the activity being shown in the activity clip.
9. A video display system according to any one of claims 1 to 6, where each display window is divided into bands, the processing sub-system operable to display one or more action buttons in a band of the display window separate from the band of the display window in which the activity clip is being played.
10. A video display system according to any one of claims 1 to 8, where one or more action buttons are superimposed over the playback of the activity clip.
11. A video display system according to any preceding claim where the visual display subsystem is divided into bands, the matrix of display windows being displayed in one band and one or more action buttons are displayed in another band.
12. A video display system according to any one of claims 9 to 11, where the action buttons are one or more of the following: a slider timeline; an approval action button; a playback action button; a save action button; a watch later action button; a watched action button; a highlights action button.
13. A video display system according to any preceding claim, further comprising an input subsystem and where, when a user selects a display window using the input sub-system, the processing sub-system operates to cease displaying the activity clip in the remaining display windows and use the visual display sub-system to solely display the activity clip being displayed in the selected display window and, to recommence displaying the activity clips in the remaining display windows when the activity clip shown in the selected display window has completed or the user again selects the enlarged display window.
14. A video display system according to claim 13, where the selected display window is enlarged to obscure at least one other display window on selection and returns back to its original size and position on completion of display of the activity clip or an appropriate user action.
15. A video display system according to any preceding claim, further comprising an additional display window, the additional display window operable to show a summary image and/or metadata of at least the next activity clip in the set of activity clips to be displayed.
16. A method of displaying activity clips from a set of activity clips, the method comprising the steps of: processing an initial number of activity clips in the set of activity clips equal to a number of display windows in a matrix; simultaneously playing these initial activity clips in separate display windows; on completing playback of each activity clip, playing the next activity clip in the set of activity clips in the same display window as the now completed activity clip until each activity clip in the set of activity clips is or has been displayed in a display window.
17. A method of displaying activity clips according to claim 16, further comprising the step of displaying a static image for a predetermined length of time in each display window before playing an activity clip.
18. A method of displaying activity clips according to claim 16 or claim 17, where each activity clip in the set of activity clips includes metadata, the method further comprising the step of filtering the activity clips in the set of activity clips based on user defined criteria relating to the metadata.
19. A method of displaying activity clips according to any one of claims 16 to 18, where each activity clip in the set of activity clips includes metadata, the method further comprising the step of superimposing one or more elements of the metadata over the playback of the activity clip.
20. A method of displaying activity clips according to any one of claims 16 to 19, where each activity clip in the set of activity clips includes metadata, the method further including the steps of: dividing each display window into bands; and displaying one or more elements of the metadata in a band of the display window separate from the band of the display window in which the activity clip is being played.
21. A method of displaying activity clips according to any one of claims 16 to 20, further comprising the steps of: dividing each display window into bands; and displaying one or more action buttons in a band of the display window separate from the band of the display window in which the activity clip is being played.
22. A method of displaying activity clips according to any one of claims 16 to 20, further comprising the step of superimposing one or more action buttons over the playback of the activity clip.
23. A method of displaying activity clips according to any one of claims 16 to 22, further comprising the steps of: selecting a display window using an input sub-system; ceasing to display activity clips in each display window other than the selected display window; displaying the activity clip playing in the selected display window in an enlarged window that obscures at least one other display window; reducing the selected display window back to its original size and position on completion of display of the activity clip or an appropriate user action; and recommencing playing of the activity clips shown in each display window other than the selected display window.
24. A method of displaying activity clips according to any one of claims 16 to 23, further comprising the step of displaying a summary image and/or metadata of at least the next activity clip in the set of activity clips to be displayed in an additional display window.
PCT/AU2022/050668 2021-07-07 2022-06-29 Visual display system WO2023279140A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2021103923A AU2021103923A4 (en) 2021-07-07 2021-07-07 Video display system
AU2021103923 2021-07-07

Publications (1)

Publication Number Publication Date
WO2023279140A1 true WO2023279140A1 (en) 2023-01-12

Family

ID=77563702

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2022/050668 WO2023279140A1 (en) 2021-07-07 2022-06-29 Visual display system

Country Status (2)

Country Link
AU (1) AU2021103923A4 (en)
WO (1) WO2023279140A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090327100A1 (en) * 2008-06-29 2009-12-31 TV1.com Holdings, LLC Method of Internet Video Access and Management
US20100097471A1 (en) * 2008-10-17 2010-04-22 Honeywell International Inc. Automated way to effectively handle an alarm event in the security applications
US20120257062A1 (en) * 2011-04-06 2012-10-11 Hon Hai Precision Industry Co., Ltd. Video switch system and method of viewing surveillance videos
US20140022382A1 (en) * 2012-07-18 2014-01-23 Vivotek Inc. Video setting method
US20160283074A1 (en) * 2008-08-20 2016-09-29 Honeywell International Inc. Infinite recursion of monitors in surveillance applications
US20170201724A1 (en) * 2010-11-05 2017-07-13 Razberi Technologies, Inc. System and method for a security system
US20180330590A1 (en) * 2012-10-26 2018-11-15 Sensormatic Electronics, LLC Transcoding mixing and distribution system and method for a video security system


Also Published As

Publication number Publication date
AU2021103923A4 (en) 2021-09-09


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22836381

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE