US20160062992A1 - Shared server methods and systems for information storage, access, and security - Google Patents


Info

Publication number
US20160062992A1
US20160062992A1 (application US 14/686,192)
Authority
US
United States
Prior art keywords
media
recording
metadata information
processors
storage location
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/686,192
Inventor
Allan Chen
Yun Long Tan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Coban Technologies Inc
Original Assignee
Coban Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Coban Technologies Inc filed Critical Coban Technologies Inc
Priority to US14/686,192 priority Critical patent/US20160062992A1/en
Assigned to COBAN TECHNOLOGIES, INC reassignment COBAN TECHNOLOGIES, INC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHEN, ALLAN, TAN, YUN LONG
Priority to PCT/US2015/047532 priority patent/WO2016033523A1/en
Publication of US20160062992A1 publication Critical patent/US20160062992A1/en
Abandoned

Classifications

    • G06F17/30023
    • G06F16/43 Querying (information retrieval of multimedia data, e.g. slideshows comprising image and additional audio data)
    • G06F12/0246 Memory management in non-volatile memory in block erasable memory, e.g. flash memory
    • G06F12/1408 Protection against unauthorised use of memory or access to memory by using cryptography
    • G06F16/13 File access structures, e.g. distributed indices
    • G06F16/21 Design, administration or maintenance of databases
    • G06F17/30091
    • G06F17/30289
    • G06F21/1011 Digital rights management [DRM] by binding digital rights to devices
    • G06F21/44 Program or device authentication
    • G06F21/79 Protection of internal or peripheral components to assure secure storage of data in semiconductor storage media, e.g. directly-addressable memories
    • G06Q50/26 Government or public services
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • H04L9/3231 Entity authentication using biological data, e.g. fingerprint, voice or retina
    • H04L9/3242 Message authentication using keyed hash functions, e.g. message authentication codes [MACs], CBC-MAC or HMAC
    • H04N21/214 Specialised server platform, e.g. server located in an airplane, hotel, hospital
    • H04N21/2353 Processing of content descriptors, e.g. coding, compressing or processing of metadata
    • H04N21/239 Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests
    • H04N21/2543 Billing, e.g. for subscription services
    • H04N21/25816 Management of client data involving client authentication
    • H04N21/2743 Video hosting of uploaded data from client
    • H04N7/185 Closed-circuit television [CCTV] systems for receiving images from a mobile camera, e.g. for remote control
    • G06F2212/1024 Latency reduction
    • G06F2212/1052 Security improvement
    • G06F2212/7202 Flash memory management: allocation control and policies
    • G06F2212/7207 Flash memory management: management of metadata or control data

Definitions

  • Metadata associated with either audio recordings or video recordings is a relatively small amount of data compared to the audio or video data.
  • today's systems typically embed the metadata as part of the audio or video data file such that access to the metadata requires access to the potentially large multi-media file.
  • most access programs require an entire file to understand the structure and content of the file itself. Accordingly, to access any metadata associated with a typical multi-media file, one must have complete access to the entire multi-media file.
  • a computer system configured to collect and manage metadata associated with one or more multi-media recordings.
  • the computer system includes one or more processors and one or more network communication interfaces communicatively coupled to the one or more processors.
  • the computer system also includes a storage area accessible to the one or more processors.
  • the storage area may be used to store executable instructions for the processor(s) and to store any collected (e.g., recorded) information.
  • these two types of data may be stored in separate logical areas of the storage area.
  • the computer system may be configured, by the executable instructions, to receive, via the one or more network communication interfaces, metadata information pertaining to at least one multi-media recording.
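The separation the bullets above describe, metadata kept in a logical area apart from the potentially large media file so it can be received and queried independently, can be sketched in a few lines. The class and method names below are illustrative only, not from the disclosure:

```python
# Illustrative sketch (invented names): metadata lives in a logical area
# separate from the media bytes, so reading it never touches the large file.

class MetadataStore:
    """Holds per-recording metadata apart from the media blobs."""

    def __init__(self):
        self._metadata = {}   # recording_id -> metadata dict
        self._media = {}      # recording_id -> opaque media blob

    def receive_metadata(self, recording_id, metadata):
        # Metadata may arrive (e.g., over a network interface) before,
        # with, or after the media file itself.
        self._metadata.setdefault(recording_id, {}).update(metadata)

    def store_media(self, recording_id, blob):
        self._media[recording_id] = blob

    def get_metadata(self, recording_id):
        # No read of the (potentially large) media blob is required.
        return self._metadata.get(recording_id, {})

store = MetadataStore()
store.receive_metadata("rec-001", {"officer": "A. Chen", "event": "stop sign"})
print(store.get_metadata("rec-001")["event"])  # stop sign
```

The point of the sketch is only the access pattern: callers holding a recording identifier get metadata without any access to the multi-media file itself.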
  • FIGS. 1A-B illustrate a rear view and a front view, respectively, of a device for capturing (e.g., recording) multi-media and metadata according to some disclosed embodiments.
  • FIG. 4 illustrates an intelligent docking, upload, and charging station for battery packs and portable recording devices according to some disclosed embodiments.
  • FIGS. 8A-F illustrate excerpts of metadata files using eXtensible Markup Language (XML) for the data format, according to some disclosed embodiments.
  • Multi-media will be used throughout this disclosure to refer to files collected (e.g., recorded) by an audio or audio/video recorder.
  • Multi-media files may include only audio, only video, or audio and video together, and the information may be compressed using an industry standard compression technology (e.g., Moving Picture Experts Group (MPEG) standards, Audio Video Interleave (AVI), etc.) or another proprietary compression or storage format.
  • Embodiments of the present disclosure provide for management of multi-media files and associated metadata that might be collected by one or more mobile surveillance systems, portable video recording devices, and other types of data recorders.
  • the mobile (and possibly stationary) surveillance system devices may be configured to capture video, audio, and data parameters pertaining to activity in the vicinity of the surveillance system, for example a police vehicle.
  • Other types of vehicles and other situations requiring a surveillance unit are also within the scope of this disclosure.
  • Other types of vehicles may include, but are not limited to, any transportation means equipped with a mobile surveillance system (e.g., civilian transport trucks).
  • the disclosed embodiments are explained in the context of mobile surveillance systems for vehicles that aid in law enforcement such as buses, ambulances, police motorcycles or bicycles, fire trucks, airplanes, boats, military vehicles, etc.
  • Data from other types of vehicles, including non-law-enforcement vehicles, may be collected as a possible aid to law enforcement (or for other applicable uses), at least in part because of the disclosed data mining and coordination techniques.
  • the evidence may go to a forensic laboratory prior to arriving at the courtroom.
  • Evidence admissibility in court is predicated upon an unbroken chain of custody. It is important to demonstrate that the evidence introduced at trial is the same evidence collected at the crime scene (that is, all access to the evidence, e.g., electronic files, was controlled and documented) and that the evidence was not altered in any way. Requirements for law enforcement are further described in "Criminal Justice Information Services (CJIS) Security Policy," version 5.3, published Aug. 4, 2014, referenced as "CJISD-ITS-DOC-08140-5.3," which is hereby incorporated by reference in its entirety.
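One common way to make such a custody trail tamper-evident is a hash chain anchored to the evidence file's digest: altering any logged entry changes every digest after it. This is offered as an illustrative sketch of the general technique, not as the mechanism this disclosure specifies:

```python
# Illustrative hash-chain sketch (not the patent's specified mechanism):
# each custody entry's digest incorporates the previous digest, so any
# alteration of an earlier entry invalidates all later digests.
import hashlib

def chain_entry(prev_digest: str, entry: str) -> str:
    """Fold one custody-log entry into the running chain digest."""
    return hashlib.sha256((prev_digest + entry).encode()).hexdigest()

log = ["collected at scene", "transferred to forensic lab", "presented at trial"]

# Anchor the chain to a digest of the evidence file itself (bytes are fake here).
digest = hashlib.sha256(b"evidence-file-bytes").hexdigest()
for entry in log:
    digest = chain_entry(digest, entry)

# Verification simply recomputes the chain; a mismatch indicates tampering.
```

In a real deployment the chain would typically use a keyed construction (e.g., an HMAC, per the H04L9/3242 classification above) so that an attacker cannot simply recompute the chain after editing the log.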
  • disclosed embodiments may allow for comprehensive back-office video management software to be provided using a Software as a Service (SaaS) architecture, giving each agency (even small remote agencies) the tools they need to capture, transfer, store and manage their digital video evidence from car to court. That is, the disclosed system and back-office management techniques meet the preservation of evidence requirements outlined above with respect to management of digital evidence for law enforcement. All activity with respect to digital evidence in the back-office system may be logged to ensure proper documentation of evidence handling.
  • the disclosed system may include electronic transfer of evidence in a controlled manner and may provide comprehensive coordination of potential evidence captured from a plurality of surveillance systems.
  • the disclosed system may also include integrated DVD burning software for easy and accurate evidence transfer.
  • USB port 140 may be provided for general peripheral connectivity and expansion according to some disclosed embodiments.
  • An integrated global positioning system (GPS) module 120 with optional external antenna or connector 115 is used in part for capturing location data and for speed logging. The GPS information may also be used for time synchronization and to coordinate data, ultimately facilitating map-based search and synchronization (e.g., locating recorded information from a time and/or location across a plurality of recording devices).
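The map-based search just described, locating recordings near a given time and place across many devices, might look like the following sketch. The record layout, time window, and distance threshold are all assumptions for illustration:

```python
# Hypothetical sketch: with GPS-synchronized timestamps and coordinates,
# find recordings from any device near a given time and place.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby(recordings, t, lat, lon, window_s=300, radius_km=1.0):
    """Recordings within window_s seconds and radius_km of the query point."""
    return [rec for rec in recordings
            if abs(rec["t"] - t) <= window_s
            and haversine_km(rec["lat"], rec["lon"], lat, lon) <= radius_km]

recs = [{"id": "car-7", "t": 1000, "lat": 29.76, "lon": -95.37},
        {"id": "car-9", "t": 5000, "lat": 29.76, "lon": -95.37}]
print([r["id"] for r in nearby(recs, 1100, 29.76, -95.37)])  # ['car-7']
```

Because every device's clock is disciplined by GPS, timestamps from different recorders are directly comparable, which is what makes this cross-device query meaningful.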
  • Dual front facing cameras 125 may include both a wide angle video camera and a tight field of view camera for optical zoom effect snap shots.
  • a record indicator 130 provides an indication of a current operating mode for integrated system 100 .
  • a G-Sensor/Accelerometer may be used for impact detection and to automatically initiate record mode.
  • the G-Sensor/Accelerometer may also provide data logging for impact statistics and road condition data.
  • A DIO (Digital Input/Output) interface may also be included.
  • the DIO can also be used to control external relays or other devices as appropriate.
  • the DIO can also be used to detect brake, light bar, car door, and gun lock so that the video recording can be automatically triggered.
  • a combination power button and brightness control 145 can be used to turn on the system and control the brightness of the monitor after the system is turned on.
  • Programmable function button 150 provides a user definable external button for easy access to instigate any function provided by integrated system 100 .
  • a user may define function button 150 to perform an action with one touch (e.g., instant replay, event tagging of a particular type, etc.).
  • An articulating touchscreen 165 may be used to view video in real-time or in one or more playback modes.
  • Touchscreen 165 may also serve as an input mechanism, providing a user interface to integrated system 100 .
  • An integrated speaker (not shown) may be used for in-car audio monitoring and in-car video/audio file playback.
  • A removable SSD flash drive 170 (e.g., secure digital (SD) or universal serial bus (USB) type) may be provided for transferring recorded data.
  • removable SSD flash drive 170 may be secured via a mechanical removable media key lock 160 .
  • event based data is recorded and written to the removable drive to be transferred to a back office server for storage and management.
  • Wireless microphone sync contacts 175 may be configured to synchronize a wireless microphone/camera, such as a body worn camera and microphone, for communication with integrated system 100 .
  • other synchronization methods for wireless microphone/cameras include utilizing NFC or RFID capability between the wireless device and integrated system 100 .
  • integrated mobile surveillance system 100 may be configured to include functional components to provide operational characteristics that may include the following.
  • a pre-event playback function may be used to tag historical events.
  • integrated mobile surveillance system 100 may record continuously to internal storage and store tagged information (e.g., marked for export) to removable storage.
  • the operator may instruct the system to navigate back to an earlier time captured in the internal storage and play back that portion of video/audio information.
  • The selected video, at any available point in time, may be marked, tagged for extraction, and stored to removable storage, as if the event had been tagged at that point in time.
  • a component may provide an instant replay function configured to playback the last predetermined amount of time with one button press.
  • both the instant replay and pre-event playback (along with general system operation) allow for simultaneous playback while the system is concurrently recording information.
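The pre-event and instant-replay behavior above reduces to a bounded ring buffer: recording continues to internal storage while any trailing span can be tagged and copied out to removable storage. All names in this sketch are hypothetical:

```python
# Illustrative ring-buffer sketch (invented names) of pre-event tagging:
# continuous capture to a bounded internal buffer; tagging copies the last
# N seconds out, as if the event had been tagged when it occurred.
from collections import deque

class PreEventRecorder:
    def __init__(self, buffer_seconds, fps=1):
        self.buffer = deque(maxlen=buffer_seconds * fps)  # internal storage
        self.removable = []                               # tagged exports

    def capture(self, frame):
        self.buffer.append(frame)  # recording continues regardless of tags

    def tag_last(self, seconds, fps=1):
        """Mark the trailing span for export to removable storage."""
        clip = list(self.buffer)[-seconds * fps:]
        self.removable.append(clip)
        return clip

rec = PreEventRecorder(buffer_seconds=60)
for i in range(100):          # capture frames 0..99; buffer keeps 40..99
    rec.capture(i)
clip = rec.tag_last(10)       # export the last 10 seconds
print(clip[0], clip[-1])      # 90 99
```

Because capture and export operate on the same buffer, playback or extraction of an earlier span can proceed while new frames continue to arrive, matching the simultaneous record/playback behavior described above.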
  • Pre-defined event tags and pre-defined event tagging functions may also be provided.
  • tags may include DWI, felony, speeding, stop sign, chase, etc.
  • The tagging action may be used to catalog portions of recorded data. For example, after an event is indicated as ending (e.g., a stop recording indication), an option to select a predefined event may be displayed. Upon selection, the system may allow an associated portion of collected information to be marked in a text file for current and future identification and storage.
  • When the tagged information is transferred to the data management software, it may be searched by event type and maintained on the server for a predefined retention period based on the event type.
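A minimal sketch of per-event-type retention follows. The retention periods shown are made up for illustration; the disclosure does not specify values:

```python
# Hypothetical sketch: each tagged event type maps to a server-side
# retention period; events past their period become eligible for purge.
import datetime

RETENTION_DAYS = {"DWI": 3650, "felony": 3650, "speeding": 365, "stop sign": 90}

def purge_due(events, today):
    """Return ids of events whose (assumed) retention period has elapsed."""
    due = []
    for ev in events:
        keep_days = RETENTION_DAYS.get(ev["type"], 365)  # assumed default
        if (today - ev["recorded"]).days > keep_days:
            due.append(ev["id"])
    return due

events = [
    {"id": "e1", "type": "stop sign", "recorded": datetime.date(2015, 1, 1)},
    {"id": "e2", "type": "felony", "recorded": datetime.date(2015, 1, 1)},
]
print(purge_due(events, datetime.date(2016, 1, 1)))  # ['e1']
```

Searching by event type is then just a filter over the same records, since the type travels with the tagged metadata rather than inside the media file.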
  • a streaming function may also be provided to stream live view and recorded video, audio, and/or data over available wireless and wired networks.
  • the integrated system 100 may also integrate “hotspot” capabilities which allow the system to serve as an agency accessible, mobile wireless local area network (WLAN).
  • Example device 200 comprises a programmable control device 210 which may be optionally connected to input device 260 (e.g., keyboard, mouse, touchscreen, etc.), display 270 or program storage device 280 .
  • Included with programmable control device 210 is a network interface 240 for communication via a network with other computers and infrastructure devices (not shown).
  • network interface 240 may be included within programmable control device 210 or be external to programmable control device 210 . In either case, programmable control device 210 may be communicatively coupled to network interface 240 .
  • Program Storage Device (PSD) 280 represents any form of non-volatile storage including, but not limited to, all forms of optical and magnetic storage elements including solid-state storage.
  • Program control device 210 may be included in a device 200 and be programmed to perform methods, including hybrid storage of metadata and associated multi-media files, in accordance with this disclosure.
  • Program control device 210 comprises a processor unit (PU) 220 , input-output (I/O) interface 250 and memory 230 .
  • Processing unit (PU) 220 may include any programmable controller device including, for example, the Intel Core®, Pentium®, and Celeron® processor families from Intel and the Cortex® ARM processor families from ARM (Intel Core®, Pentium®, and Celeron® are registered trademarks of Intel Corporation; Cortex® and ARM® are registered trademarks of ARM Limited).
  • Memory 230 may include one or more memory modules and comprise random access memory (RAM), read only memory (ROM), programmable read only memory (PROM), programmable read-write memory, and solid state memory.
  • PU 220 may also include some internal memory including, for example, cache memory.
  • Storage media as embodied in storage devices such as PSD 280 and memory internal to program control device 210 are suitable for tangibly embodying computer program instructions.
  • Storage media may include, but not be limited to: magnetic disks (fixed, floppy, and removable) and tape; optical media such as CD-ROMs and digital video disks (DVDs); and semiconductor memory devices such as Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Gate Arrays and flash devices.
  • FIG. 2B illustrates a secure digital (SD) card 285 that may be configured as the programmable storage device described above.
  • An SD card is a nonvolatile memory card format for use in portable devices such as mobile phones, digital cameras, handheld consoles, and tablet computers.
  • An SD card may be inserted into a receptacle on the device conforming to the SD specification or may alternately be configured with an interface to allow plugging into a standard USB port (or other port).
  • An example of the adapter for USB compatibility 286 is illustrated in FIG. 2C .
  • Modern computer operating systems are typically configured to automatically permit access to an SD card when it is plugged into an active computer system (sometimes referred to as plug-n-play).
  • a plug and play device or computer bus is one with a specification that provides for or facilitates the discovery of a hardware component in a system without the need for physical device configuration or user intervention in resolving resource conflicts.
  • disclosed systems may incorporate a specifically modified interface to the removable storage drive utilized in device 100 (i.e., removable media 170 ). Modifications permitting specialized access to removable media, such as a secure storage drive, are described in co-pending U.S. patent application Ser. No. 14/588,139, entitled “Hidden Plug-in Storage Drive for Data Integrity,” by Hung C. Chang, which is incorporated by reference herein. Modifications permitting specialized functionality from removable media are described in co-pending U.S. patent application Ser. No. 14/593,722, entitled “Self-contained Storage Device for Self-contained Application Execution,” by Allan Chen et al., which is incorporated by reference herein.
  • block diagram 300 illustrates one embodiment of an integrated audio-video-data surveillance system. Note that each of the components shown in block diagram 300 may be communicatively coupled to other components via communication channels (e.g., bus) not shown in the block diagram.
  • the flow arrows of block diagram 300 are general in nature to illustrate the movement of information.
  • video and audio may be captured by camera 305 and microphone 306 respectively.
  • Captured data may be provided initially to video/audio encoder 310 to encode and optionally compress the raw video data and the encoded data may be stored in a memory area (not shown) for access by CPU 315 .
  • Encoded data may also be selectively stored to either internal failsafe hard drive 320 or removable mobile hard drive 325 individually or to both simultaneously.
  • Data may also be transferred, for example at the direction of a user, from internal failsafe hard drive 320 to removable mobile hard drive 325 .
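The selective storage described above (internal failsafe drive, removable drive, or both simultaneously) reduces to a small routing step at write time; the names below are illustrative:

```python
# Illustrative sketch: each encoded chunk is routed to the internal
# failsafe store, the removable store, or both, per the selected targets.
def store_chunk(chunk, targets, failsafe, removable):
    if "failsafe" in targets:
        failsafe.append(chunk)
    if "removable" in targets:
        removable.append(chunk)

failsafe, removable = [], []
store_chunk(b"frame-1", {"failsafe"}, failsafe, removable)                # internal only
store_chunk(b"frame-2", {"failsafe", "removable"}, failsafe, removable)   # both at once
print(len(failsafe), len(removable))  # 2 1
```

A later user-directed transfer from the failsafe drive to the removable drive is then just a copy between the two stores.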
  • Data capture devices such as general purpose input output (GPIO) 330 and GPS 331 may be used to capture metadata to associate with captured surveillance information (e.g., multi-media files). All pertinent captured metadata may be associated with captured video/audio recordings using structured text files such as, for example, eXtensible Markup Language (XML) files. An example of such structured text files is explained in more detail below with reference to FIGS. 8A-F .
  • XML files may be utilized to store many different types of metadata associated with captured video and data.
  • Metadata may be used to describe “recording circumstances” attributable to the surveillance information (e.g., multi-media recordings). That is, the metadata may describe when, where, who, and why information, among other things, to indicate information about the act of recording the surveillance information.
  • the metadata may include, but not be limited to, timestamps of capture (the internal clock (not shown) of system 100 may be synchronized using GPS data), event tags, GPS coordinates, GPS and RADAR/LIDAR measurements from a target vehicle, breathalyzer analysis information, analytical information, and so on. Analytical information will be discussed in more detail below with reference to FIG. 7 .
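As noted above, structured text (XML) files may carry these recording-circumstance attributes alongside each recording. The sketch below, using Python's standard `xml.etree.ElementTree`, shows one hypothetical way such a file might be built; every element and attribute name here is an illustrative stand-in, not the actual schema of FIGS. 8A-F.

```python
import xml.etree.ElementTree as ET

def build_recording_metadata(video_file, officer_id, unit_id, gps, event_tags, timestamp):
    """Build a structured-text (XML) metadata record for one recording.

    Element and attribute names are illustrative only; a real system would
    follow the schema of the files shown in FIGS. 8A-F.
    """
    root = ET.Element("recording", attrib={"file": video_file})
    ET.SubElement(root, "timestamp").text = timestamp
    ET.SubElement(root, "officerID").text = officer_id
    ET.SubElement(root, "patrolUnit").text = unit_id
    ET.SubElement(root, "gps", attrib={"lat": str(gps[0]), "lon": str(gps[1])})
    tags = ET.SubElement(root, "eventTags")
    for tag in event_tags:
        ET.SubElement(tags, "tag").text = tag
    return ET.tostring(root, encoding="unicode")

xml_text = build_recording_metadata(
    "video_0001.mp4", "JS-1042", "54", (29.76, -95.37), ["DUI"], "2015-04-14T21:05:00Z")
```

Because the metadata lives in its own small file, it can be uploaded, searched, or indexed without touching the much larger video file it describes.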
  • Wireless interface 335 may be used to upload information from one or more surveillance systems to back office servers located, for example, at a police station or to cloud based resources. Back office servers and cloud based resources will be discussed in more detail below with reference to FIG. 6 .
  • advanced docking station 400 may provide additional benefits for users that maintain a plurality of portable body worn cameras 450 and/or a plurality of surveillance systems. Some or all portable body worn cameras 450 may incorporate one or more programmable function buttons 405 . As shown in FIG. 4 , docking station 400 may have multiple ports/cradles 415 . Docking station 400 may assist in data upload, device checkout, device upgrade (e.g., firmware/software update), recharging of battery packs 420 and other maintenance type functions that may be performed, for example, at a police station. For clarity, not all repeated elements in FIG. 4 have an associated reference number. Embodiments of the disclosed docking station may support maintenance functions for multiple portable devices such as body worn cameras 450 concurrently.
  • the disclosed docking station 400 may be multifunctional for uploading and/or downloading of video/audio and associated metadata.
  • Configuration data such as unit ID, user ID, operational modes, updates, and so on, may be maintained and versions of such configuration information may be presented on display screen 410 (which may also be a touchscreen interface to docking station 400 ).
  • Docking station 400 may have integrated interfaces to different types of surveillance systems. Interfaces such as USB, wired Ethernet, or wireless network, as well as interface ports for battery charging, may be included. Docking station 400 may also contain a CPU and be configured as a computer device (see FIG. 1 ) with an optional integrated touchscreen display 410 and output connectors (not shown) for an optional external display/mouse or device expansion. Docking station 400 may have an option for a wireless display (not shown) to be used for status indication as well as for an interface for checkout/assignment of surveillance system devices to a user or group of users (See FIG. 5 ). Docking station 400 may include wireless communications such as Bluetooth and/or 802.11AC/AD.
  • Docking station 400 may also be configured to work as an Access Point for a wireless network or may be configured to act as a bridge to allow portable client devices to access functionality of docking station 400 and possibly connect to other system components including local or cloud based servers. Docking station 400 may also include functional software or firmware modules to support hybrid storage of recorded multi-media and associated metadata automatically. Hybrid storage is discussed in more detail below with reference to FIG. 7 .
  • Docking station 400 may also have an internal storage device to facilitate fast off-load storage which may be used to facilitate a download/forward process for audio/video and metadata captured on a surveillance system device (e.g. the body worn camera 450 ).
  • the user may place the body worn camera 450 into a docking station cradle 415 , and docking station 400 offloads the data to the local onboard storage drive (not shown), which can immediately (or based on a timer) upload that information, or a portion thereof if a hybrid model is used, to a server (e.g., back office server or cloud storage). Uploads may be prioritized based on many different attributes such as time, size, event type priority, and so on.
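The upload prioritization mentioned above (time, size, event type priority) can be modeled as a simple priority queue. The ordering below, high-priority event type first, then oldest, then smallest, is an assumed policy for illustration, not one prescribed by the disclosure:

```python
import heapq

# The event-type ranking is an assumed configuration; real deployments
# would let an administrator define it.
EVENT_PRIORITY = {"DUI": 0, "pursuit": 1, "traffic_stop": 2, "untagged": 3}

def upload_order(recordings):
    """Return file names in upload order: highest event priority first,
    then oldest capture time, then smallest size, so that small critical
    files reach the server quickly."""
    heap = [(EVENT_PRIORITY.get(r["event"], 99), r["time"], r["size_mb"], r["name"])
            for r in recordings]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[3] for _ in range(len(heap))]

order = upload_order([
    {"name": "a.mp4", "event": "untagged", "time": 1, "size_mb": 900},
    {"name": "b.mp4", "event": "DUI", "time": 5, "size_mb": 400},
    {"name": "c.mp4", "event": "DUI", "time": 2, "size_mb": 700},
])
```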
  • Docking station 400 may also have an integrated locking mechanism for one or more of the uploading/charging ports/cradles 415 .
  • the docking station 400 may be configured to control the locking mechanism to hold or release the wearable device in order to prevent the user from taking it out during uploading/downloading, or to make sure that only the recently “checked out” device is removed, for example.
  • the touchscreen display 410 of FIG. 4 illustrates one possible graphical user interface (GUI) layout as an example only. Actual layouts may contain more information and features and may be configurable based on requirements of different end users. In FIG. 4 , the GUI shows examples of upload status and battery charging progress. Other screens may be available on the GUI display 410 to provide other status information such as unit ID, user ID, and/or to assist with user checkout and assignment of devices to different mobile surveillance systems.
  • process flow 500 illustrates a possible method for assisting law enforcement personnel with compliance with chain of custody of evidence requirements.
  • Chain of custody of evidence requirements may be implemented with the assistance of docking station 400 .
  • the computer device at the police station is considered to be docking station 400 (but may, for example, be another workstation-type device), and a computer device in a police car will be referred to as a “mobile surveillance system.”
  • Both docking station 400 and the mobile surveillance system are example embodiments of computer device 100 of FIG. 1 described above.
  • A portable recording apparatus (e.g., body worn camera 450 ) includes a storage device (e.g., 285 , 286 ). The portable recording device may be connected to docking station 400 , which is configured to interact with the storage device of the portable recording device.
  • docking station 400 receives a request to assign a portable recording device (e.g., body worn camera 450 , or wireless microphone) to an officer (e.g., Officer “Joe Smith”) for use in a patrol “shift.”
  • the request may, for example, come from a GUI presented on touchscreen 410 .
  • the request may also include information to assign the portable recording device to a particular mobile surveillance system for that shift (e.g., surveillance system of “patrol car 54”).
  • docking station 400 writes control information to the storage device of the portable recording device to identify an appropriate mobile device (e.g., 301 ).
  • the control information may include storage serial number, officer's ID (e.g., “Joe”), patrol car (e.g., “54”), officer's password (likely encrypted), recording parameter settings, or other information useful in assisting in audit tracking of the portable recording device and any information collected on the storage device of the portable recording device during the shift.
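A minimal sketch of such a checkout control record follows. All field names are hypothetical, and the salted hash stands in for the "likely encrypted" officer password; a production system would use a random salt and a dedicated password-hashing scheme:

```python
import hashlib
import json

def make_checkout_record(storage_serial, officer_id, patrol_car, password, settings):
    """Control information written to the portable device's storage at checkout.

    The password is never stored in the clear; a salted SHA-256 digest is a
    simplistic stand-in for the encryption mentioned in the disclosure.
    """
    salt = storage_serial  # simplistic salt for this sketch only
    pw_hash = hashlib.sha256((salt + password).encode()).hexdigest()
    record = {
        "storage_serial": storage_serial,
        "officer_id": officer_id,
        "patrol_car": patrol_car,
        "password_hash": pw_hash,
        "recording_settings": settings,
        "state": "checked_out",   # used for audit tracking of the device
    }
    return json.dumps(record)

raw = make_checkout_record("SN-0042", "Joe", "54", "s3cret", {"resolution": "720p"})
rec = json.loads(raw)
```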
  • the portable recording device is removed from docking station 400 for association with a mobile surveillance system (e.g., 301 ).
  • the portable recording device (e.g., 450 ) is now in a “checked out” state.
  • the officer authenticates to a mobile surveillance system.
  • the portable recording device is connected to the mobile surveillance system at block 530 .
  • the storage device of the portable recording device becomes accessible to the mobile surveillance system if authentication information is accurate.
  • Authentication may require that the mobile surveillance system match a previously identified (e.g., at checkout) mobile surveillance system and may optionally only become available after a second check that a proper officer has authenticated to the mobile surveillance system. That is, the portable recording device must be associated with a proper surveillance system (e.g., 301 ), and the authenticated user must be validated as a proper user.
  • Officer “Joe Smith” is authenticated to the mobile surveillance system and the mobile surveillance system is the one in patrol car 54.
  • the surveillance system in patrol car 54 is the system which Officer Smith should be using for his shift. Accordingly, prior to allowing any access to the storage drive of the portable recording device from the mobile surveillance system, both attributes should be verified. Such increased authentication methods may assist in compliance with chain of custody of evidence requirements for gathering and maintenance of evidence. Note that some law enforcement agencies require two-factor authentication for access to data. Validating “checkout information” regarding both the portable device and the authenticated officer (e.g., both the association with the surveillance system of patrol car 54 and confirming Officer Smith is logged into that system) is one example of two-factor authentication.
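The two-factor check described above reduces to requiring both attributes, the correct patrol car and the correct officer, to match the checkout record before the storage device is exposed. A minimal sketch, with assumed field names:

```python
def storage_accessible(checkout, mounted_system_id, authenticated_officer):
    """Grant access to the portable device's storage only when BOTH factors match:
    (1) the device is mounted in the surveillance system it was checked out to, and
    (2) the officer authenticated to that system is the officer it was assigned to."""
    return (checkout["patrol_car"] == mounted_system_id
            and checkout["officer_id"] == authenticated_officer)

checkout = {"officer_id": "Joe Smith", "patrol_car": "54"}
assert storage_accessible(checkout, "54", "Joe Smith") is True
assert storage_accessible(checkout, "54", "Jane Doe") is False   # wrong officer
assert storage_accessible(checkout, "12", "Joe Smith") is False  # wrong car
```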
  • the mobile surveillance system records and stores evidence and surveillance data onto the storage device of the portable recording device.
  • all data recorded on the storage device may be associated with the officer for audit tracking purposes as indicated at block 545 .
  • a metadata file may be used to “mark” any recorded data with officer's ID, event type, date/time, GPS location, etc.
  • actions that may take place at the end of a shift are performed.
  • recorded data may be securely (for example, but not limited to, by data encryption) uploaded wirelessly to a back office system at the police station.
  • “Securely uploaded” indicates that the recorded data will be uploaded in a manner that maintains its association with the officer and satisfies chain of custody of evidence requirements, as well as any other type of security regarding the wireless network, etc.
  • the officer may remove (e.g., disconnect) the portable recording device (e.g., 450 ) and relocate the portable recording device to the same or a different docking station 400 for upload at the police station.
  • the officer may “check in” the portable recording device so as to allow a different officer to use it on a subsequent shift. For example, checking in may be performed using a GUI interface to docking station 400 .
  • a hybrid model for storing and analyzing information may be beneficial for small and large law-enforcement agencies. Law-enforcement agencies with limited staffing and resources may find it difficult to adopt in-car or wearable video system technologies that involve complex, expensive, and cumbersome components. For example, an in-house server-based solution may require experienced computer technicians/specialists to maintain proper hardware operations. A non-server-based solution may also be challenging because it may lack functions such as system configuration, video search and storage management, and evidence life-cycle maintenance. It is contemplated that a cloud based SaaS solution may offer the flexibility and convenience required by such law enforcement agencies. Additionally, the disclosed hybrid model for storing metadata independently from actual multi-media files may work more effectively for agencies having limited bandwidth capabilities.
  • a remote application and database server may be hosted by a software as a service (SaaS) cloud application to reduce (or eliminate) the need to hire additional computer technicians.
  • Some disclosed embodiments may be implemented in a hybrid cloud and provide local (on site) data storage for portions of data that require high bandwidth across a network (e.g., Internet, police network) while maintaining metadata in the cloud. This configuration may help ensure security and integrity of digital evidentiary data by maintaining a single global copy of metadata in the cloud (for storage) while still allowing fast local access speeds for review of potentially large video/audio files.
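Under assumed policy knobs (a size threshold and a third-party-access flag, neither of which is specified by the disclosure), the hybrid split described above could be expressed as:

```python
def hybrid_destinations(recording, media_upload_threshold_mb=100):
    """Decide where each piece of a recording goes under the hybrid model:
    metadata always goes to the cloud (single global copy), while the large
    multi-media file stays on local storage unless it is small enough, or
    is flagged for third-party access. The threshold is an assumed knob."""
    destinations = {"metadata": "cloud", "media": "local"}
    if (recording["size_mb"] <= media_upload_threshold_mb
            or recording.get("third_party_access")):
        destinations["media"] = "cloud"
    return destinations

d_large = hybrid_destinations({"size_mb": 900})                             # media stays local
d_small = hybrid_destinations({"size_mb": 40})                              # small file: cloud
d_shared = hybrid_destinations({"size_mb": 900, "third_party_access": True})  # needed by a third party
```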
  • data on a shared server may be downloaded to the local data storage site as backup data and then re-uploaded to a remote (or cloud based) site if there is a systems failure or “intrusion” attack at the remote (or cloud based site).
  • the user may auto upload all data and metadata to the cloud.
  • a user may provide (or user event tags may be used as) identification criteria for certain types of videos (and their metadata) to be sent to the cloud automatically as soon as the videos are uploaded to a server (or staged on docking station 400 ) with certain “event type” metadata.
  • an administrator may define a rule such that all DUI videos are sent to cloud based storage and 2 DVD copies are burned.
  • if an officer tags a video as a DUI event type, then as soon as the video is uploaded to the cloud, the video may also be sent to a DVD burner for 2 copies automatically.
  • an email may be automatically generated and sent, or instructions may be provided to an employee to create and send an email with a time-limited access link to personnel or third parties (e.g., prosecuting attorney) who may have an interest in a DUI event.
  • a wide number of triggers and follow-on responses may be generated automatically.
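Such trigger/response pairs can be captured in a small rule table keyed by event-type tag. The DUI rule below mirrors the administrator example above; the other entry and the action names are hypothetical:

```python
# Maps an event-type tag to automatic follow-on responses. Action names
# are placeholders for whatever back-end operations an agency configures.
RULES = {
    "DUI": ["upload_to_cloud", "burn_dvd_copies:2", "email_time_limited_link"],
    "pursuit": ["upload_to_cloud"],  # hypothetical additional rule
}

def actions_for(event_tag):
    """Return the follow-on responses triggered when a video receives this tag;
    untagged or unrecognized event types trigger nothing automatically."""
    return RULES.get(event_tag, [])
```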
  • actions relating to compliance with record retention policies may be automatically generated so that as specific retention periods pass, records are automatically deleted.
  • the user may readily and easily take advantage of cloud-based storage for an almost limitless cataloguing and archiving device.
  • the SaaS component may be a system which typically includes a web-based portal that is the entry point to the software services for all users requiring data access. As with other data access points, access may be controlled by authentication means such as, but not limited to, passwords, fingerprints, encryption, and so on.
  • Authorized users may search media catalogues which may be generated from metadata obtained from a single agency or from multiple jurisdictional agencies. Users may also manage all the configuration settings of mobile/portable video/audio recording devices via a cloud based control portal. Having metadata in the cloud facilitates many different functions, such as, query search of metadata associated with audio, video or print media.
  • the metadata in the cloud and an associated interface portal may allow access to any evidentiary logs associated with the data (local or cloud based) and access a user's local hardware/software storage to review media that may not have been uploaded to cloud storage (e.g., because of bandwidth/storage constraints). That is, the cloud based system may include enough information to allow secure access back to local storage (e.g., 644 and 642 ) so that a user at police station 640 may efficiently view locally stored multi-media files. Alternatively, a user located remotely from police station 640 may obtain access (e.g., secure access via virtual private network VPN) to network and storage infrastructure at police station 640 and perform desired actions on multi-media files. Of course, bandwidth constraints of the obtained remote access (e.g., VPN) may have an effect on what actions a remote user decides to perform.
  • metadata relating to GIS information and applications for performing data analysis may reside in the cloud, while the related audio/video files remain at the user's facility. This is largely based on the size of the files and recognition that bandwidth to cloud storage may affect access to large files. However, in some situations bandwidth concerns are not a determining factor and other segmentation of data may be desired. In the case where hybrid storage is implemented and a user has local access to large files, a user may more efficiently interact with metadata in the cloud and local multi-media files.
  • a cloud-based 630 video export and access system does away with the hardware and ongoing maintenance costs of optical media based systems by providing users a secure, controlled, reliable and cost-effective method for sending video and data to third parties.
  • Video and data may be uploaded to the cloud for storage, one or more third party recipients may be assigned access rights, and a defined expiration date for third party access may also be provided. Additionally, use of the cloud may permit real-time data upload and storage which provides nearly limitless data storage capacity for integrated system 100 ( FIGS. 1A and 1B ).
  • Hybrid storage models may be implemented to define pre-requisites as to what actual multi-media files are stored in the cloud. In some embodiments, only multi-media files requiring access by third parties are uploaded to the cloud.
  • Exported data may be stored in cloud-based storage that is remotely accessible through a secured means (for example, but not limited to, a password, fingerprint reader, etc.).
  • the system may be configured to send one or more recipients an access link through automated communication methods such as email, text, and mms, etc.
  • the link sent to each recipient may include an expiration date for accessing the associated data.
  • the system may also allow a recipient of the link to review the data stored in the cloud via the Internet, download a local copy of the data for future use, and delete the data after review or download.
  • the link sent to each recipient may also limit access rights of recipients (e.g. read only, data editing, deletion, etc.).
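One plausible way to implement such expiring, rights-limited links is to sign the media id, recipient, rights, and expiry with a server-side HMAC so that none of them can be altered by the recipient. The URL shape, parameter names, and secret below are invented for illustration:

```python
import hashlib
import hmac
from urllib.parse import urlparse, parse_qs

SECRET = b"server-side-secret"  # placeholder; a real deployment keeps this private

def make_access_link(media_id, recipient, rights, expires, secret=SECRET):
    """Build a time-limited, rights-limited access link; the signature binds
    all four fields together so tampering invalidates the link."""
    payload = f"{media_id}|{recipient}|{rights}|{expires}"
    sig = hmac.new(secret, payload.encode(), hashlib.sha256).hexdigest()
    return (f"https://evidence.example/access?m={media_id}&to={recipient}"
            f"&rights={rights}&exp={expires}&sig={sig}")

def link_valid(link, now, secret=SECRET):
    """Check the signature and expiry (epoch seconds) of a link made above."""
    q = parse_qs(urlparse(link).query)
    payload = f"{q['m'][0]}|{q['to'][0]}|{q['rights'][0]}|{q['exp'][0]}"
    expected = hmac.new(secret, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, q["sig"][0]) and now < int(q["exp"][0])

link = make_access_link("vid42", "da_office", "read_only", 1000)
```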
  • FIG. 6 also graphically illustrates an example data exchange flow in block diagram 600 , through which video, audio, and print data and associated metadata may be shared.
  • Numerous users, computer-based functionalities, storage options, and associated lines of communication may be involved in data uploading and downloading.
  • one or several police vehicles 610 may transmit video and audio data and associated metadata via wireless communication means 605 to a cloud storage system 630 . Concurrently (or as needed), this data or a subset of this data may be made accessible to software applications, for example SaaS functions 620 , via communication link 606 .
  • police vehicle(s) 610 may also manually download data and metadata to local storage 644 upon arrival at police station 640 using data transmission channel 660 .
  • Data transmission channel 660 may be a wired connection or a wireless connection.
  • a classical “sneakernet” may be used by connecting a portable recording device to another device (e.g., docking station 400 ). After connection, data may be uploaded to local storage 644 , which is located at the police station, and then optionally (based on a number of different criteria) to the cloud 630 using any appropriate connection (e.g., 645 , 650 or another available connection).
  • an integrated surveillance system vendor 670 oversees and maintains SaaS functions 620 utilizing communication channel 665 .
  • the vendor may also optionally maintain the security and integrity of any cloud based storage system 630 utilizing communication channel 666 .
  • Vendor 670 may also provide all necessary technical support through its SaaS functions 620 and communication channel 645 to assist police station 640 in implementing best practices in the preservation of data evidence.
  • police station 640 , depending on available resources, may have “in-house” routers (not shown) and surveillance system backend server(s) 642 which provide redundant data storage systems.
  • police station 640 , in order to avoid expensive data storage solutions, may optionally utilize cloud storage 630 via communication channel 650 in a hybrid manner.
  • Cloud storage system 630 may also communicate directly with SaaS functions through communications channel 655 . Having multiple channels of secured communications may provide rapid and efficient data exchange, while use of various storage means (local or cloud-based) allows an inexpensive and flexible alternative for resource-limited users.
  • flow chart 700 illustrates a potential data mining strategy for captured data.
  • the disclosed data mining strategy may benefit from the above discussed hybrid storage model in a number of ways.
  • Example benefits across a single agency or multiple jurisdictionally distinct entities may include sharing of information without violating privacy or other data access concerns.
  • Sample use cases are described following this overview of flow chart 700 .
  • at least one surveillance system automatically captures video and audio data and associates that captured data with GPS positioning, timestamp, and other information captured as metadata while the vehicle containing the surveillance system is “on-patrol”.
  • All such video and audio data (including metadata) from a single “on-patrol” vehicle or multiple “on-patrol” vehicles at block 710 may be uploaded to a central storage area (e.g., a cloud) at the end of a law enforcement personnel shift.
  • the captured data segment for untagged capture may be stored in an area of a computer hard drive with a continuous loop function such that the oldest data is overwritten by newer data.
  • retention criteria based on the tag type may be applied and the appropriate data stored with other tagged data as illustrated by block 720 .
  • any necessary evidentiary access controls as illustrated by block 720 may be considered.
  • data mining is performed. Information gathered from the data mining function may be used to provide a global index of data (e.g., index of data across all available metadata). Indexed data remains available until the time limit for data retention is reached and then data (and its associated index information) may be expunged.
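A global index of this kind needs only the metadata records themselves, plus query and retention-expunge operations. A minimal in-memory sketch, with attribute names assumed for illustration:

```python
import datetime as dt

class GlobalIndex:
    """Minimal sketch of a global metadata index: entries are searchable by
    any metadata attribute and expunged once their retention period passes."""

    def __init__(self):
        self.entries = []

    def add(self, **metadata):
        self.entries.append(metadata)

    def query(self, **criteria):
        """Return entries whose metadata matches every given attribute."""
        return [e for e in self.entries
                if all(e.get(k) == v for k, v in criteria.items())]

    def expunge_expired(self, today):
        """Drop entries (and thus their index information) past retention."""
        self.entries = [e for e in self.entries if e["retain_until"] >= today]

idx = GlobalIndex()
idx.add(agency="PD-A", event="DUI", retain_until=dt.date(2016, 1, 1),
        media_location="local://pd-a/vid1")
idx.add(agency="PD-B", event="DUI", retain_until=dt.date(2020, 1, 1),
        media_location="local://pd-b/vid9")
hits = idx.query(event="DUI")              # both found via metadata alone
idx.expunge_expired(dt.date(2017, 6, 1))   # PD-A entry past retention is removed
```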
  • an unpredicted event occurs, for example, a bombing, terrorist activity, report of previous criminal activity etc.
  • the data mined in block 730 may be retrieved to assist with investigation and evidence gathering.
  • the overall system may be configured to proactively apply analytics to the captured metadata to identify possible criminal activity or potential threats to public health and safety (e.g., face or pattern recognition analysis to identify a known criminal or threat). Such analysis may then be used to produce an analytics report as is shown at block 750 .
  • the analytics report, for example, may then be reviewed by law enforcement personnel to assist with an investigation or determine if further investigation is required.
  • Collecting metadata from multiple surveillance systems to create a comprehensive index may allow a law-enforcement agency to correlate information from different systems. For example, if a set of recordings from different patrol cars at a given geographical location are of interest, then the metadata containing GPS information may identify a subset of multi-media files that may be of interest. If multiple agencies use a common global index, they may be made aware of recordings that other agencies have obtained that would otherwise be unknown to them. After following appropriate legal procedures, they may obtain access to recordings from other agencies to assist in gathering evidence. Note that access to actual multi-media recordings may not be made available because of privacy concerns, for example, but the global index informs of the existence of potentially relevant information. In this manner, coordinated inter-agency information sharing may be enhanced.
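Selecting the subset of recordings near a location of interest is a pure metadata query; for example, a haversine great-circle distance filter over the GPS attributes in the index, without ever touching the multi-media files themselves:

```python
import math

def within_radius(metadata_index, lat, lon, radius_m):
    """Identify recordings whose GPS metadata falls near a location of
    interest. Only the metadata index is consulted; the multi-media files
    need not be accessible."""
    def haversine(lat1, lon1, lat2, lon2):
        r = 6371000.0  # mean Earth radius in metres
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))
    return [m for m in metadata_index
            if haversine(m["lat"], m["lon"], lat, lon) <= radius_m]

index = [  # hypothetical metadata entries from two patrol cars
    {"file": "car54_cam.mp4", "lat": 29.7600, "lon": -95.3700},
    {"file": "car12_cam.mp4", "lat": 29.0000, "lon": -94.0000},
]
nearby = within_radius(index, 29.7601, -95.3700, radius_m=500)
```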
  • the hybrid storage model facilitates creation of a global index because the overall size of actual multi-media recordings across numerous surveillance devices may quickly become unmanageable. Additionally, chain of custody of evidence and access controls to actual multi-media files may be maintained.
  • Hybrid techniques may also be implemented as a sliding scale. That is, at one extreme a maximal hybrid technique uploads all (or nearly all) captured metadata and associated multi-media files. For example, a large police station with a big cloud presence and high bandwidth might use the maximal hybrid model. In another extreme, a minimal hybrid technique would upload only enough metadata for indexing and very few (if any) multi-media files. The minimal amount of metadata may allow for global indexing as necessary so that, when required, additional upload of data may be requested from the agency implementing the minimal hybrid model.
  • FIG. 8A illustrates a video metadata file for a captured video segment while FIG. 8F illustrates an XML segment that contains the VX-Sync data.
  • the “V” of “VX” is used to reference the particular video and the “X” to reference any event variable associated with the particular video.
  • any action taken by a user such as an action that triggered the recording, e.g. pushing a wireless microphone, or activation of a light bar, or the taking of a snapshot, will be recorded in this metadata file and associated with the variable X for future connection with the video V.
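One hypothetical shape for such a VX-Sync entry, linking video V to event variable X, is sketched below; the tag and attribute names are invented, since the actual layout is shown in FIG. 8F and not reproduced here:

```python
import xml.etree.ElementTree as ET

def vx_sync_event(video_id, event_variable, action, timestamp):
    """Sketch of one VX-Sync entry: event variable X is recorded in the
    metadata file for later connection with video V. Names are illustrative."""
    e = ET.Element("vxsync", attrib={"v": video_id, "x": event_variable})
    ET.SubElement(e, "action").text = action   # e.g. wireless mic, light bar, snapshot
    ET.SubElement(e, "time").text = timestamp
    return ET.tostring(e, encoding="unicode")

entry = vx_sync_event("V0001", "X07", "wireless_mic_trigger", "2015-04-14T21:05:00Z")
```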
  • FIG. 8B illustrates a sample metadata file with attributes and values relating to in-car activity logs based on personnel shifts.
  • FIG. 8C illustrates a file portion that relates to GPS location metadata based on personnel shifts.
  • FIG. 8D illustrates in-car error logs per personnel shift.
  • FIG. 8E illustrates an example metadata file that may be used to establish an audit trail for the associated video to satisfy evidentiary requirements relating to chain of custody.
  • FIG. 8C illustrates a series of “collected” metadata elements where several attributes have been assigned values based on data collection.
  • the attribute “patrol unit” has a value to identify a particular police vehicle and the officerID attribute has a value corresponding to the identification of a specific officer.
  • officerID may be initially blank as in elements 820 and 825 , and then be assigned an ID number as an officer logs onto (e.g., successfully authenticates to) the integrated system 100 ( FIGS. 1A and 1B ) as shown in element 830 .
  • Another attribute may be “log time” (element 835 , FIG. 8E ) which is the date and time that a data record is captured.
  • Yet other attributes, which are self-explanatory based on their names, may indicate changes in longitude and latitude reflective of a vehicle that is in motion. In addition, the speed and velocity of the vehicle in motion may be reflected in the metadata.


Abstract

Devices and methods for managing multi-media files and associated metadata in a hybrid manner are disclosed. Methods for using the device(s) to implement different methods for managing information obtained (e.g., recorded) by a plurality of recording devices are also disclosed. This disclosure also relates to comprehensive use of multiple distinct surveillance systems in a coordinated manner. For example, a set of surveillance devices configured for use by one or more law enforcement agencies or other government agencies may share metadata to facilitate indexing, sharing, accessing, and coordinating potential surveillance recordings. In one example, metadata may be uploaded to cloud storage while associated multi-media files are maintained locally by the responsible agency. Maintaining metadata and actual multi-media content separately may reduce bandwidth transmission requirements and maintain confidentiality of surveillance recordings. Further, chain of custody of evidence requirements regarding digitally recorded evidence may be complied with.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of, and priority to, U.S. Provisional Application No. 62/044,139, filed Aug. 29, 2014, and entitled, “Compact Multi-Function DVR with Multiple Integrated Wireless Data Communication Devices,” which is incorporated herein by reference.
  • STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH
  • Not applicable.
  • FIELD OF THE INVENTION
  • This disclosure relates generally to systems and methods to assist in managing information including both multi-media and associated metadata obtained (e.g., recorded) by a recording device. More particularly, but not by way of limitation, this disclosure relates to systems and methods for maintaining large multi-media files on local storage and associated metadata files on remote (e.g., cloud based) storage to facilitate searching, cataloging, indexing, audit tracking, accessibility, and other maintenance functions without requiring upload of large volumes of data.
  • BACKGROUND
  • Today's law enforcement agencies are increasing their use of digital data to collect surveillance information and other forms of data to be used as evidence in legal proceedings. The number of devices and methods for managing multi-media files collected as part of this surveillance and evidence collection is increasing over time. Multi-media files may be large. For example, a video or audio file may easily be megabytes in size depending on the length of the recording. Video files are typically larger than audio files, and they become larger based on the resolution of the video recording. That is, given current audio and video compression techniques, higher resolution video files typically require larger file sizes than either audio or lower resolution video. Video files are also typically larger than corresponding audio files because they include more data than an audio recording.
  • Metadata associated with either audio recordings or video recordings is a relatively small amount of data compared to the audio or video data. However, today's systems typically embed the metadata as part of the audio or video data file such that access to the metadata requires access to the potentially large multi-media file. Also, most access programs require an entire file to understand the structure and content of the file itself. Accordingly, to access any metadata associated with a typical multi-media file, one must have complete access to the entire multi-media file.
  • SUMMARY
  • According to a first aspect of the invention, a computer system configured to collect and manage metadata associated with one or more multi-media recordings is disclosed. The computer system includes one or more processors and one or more network communication interfaces communicatively coupled to the one or more processors. The computer system also includes a storage area accessible to the one or more processors. The storage area may be used to store executable instructions for the processor(s) and to store any collected (e.g., recorded) information. Of course, these two types of data may be stored in separate logical areas of the storage area. Overall, the computer system may be configured, by the executable instructions, to receive, via the one or more network communication interfaces, metadata information pertaining to at least one multi-media recording. The metadata information may include information regarding attributes describing recording circumstances for the at least one multi-media recording and an access location for the at least one multi-media recording. The attributes describing recording circumstances will generally provide information about when, where, why, and possibly how the recording was made. This information about recording circumstances may be helpful to determine which recordings may be of interest for a given activity or search query.
  • In a second aspect of this disclosure, the computer system (or a separate computer system) may be further configured to process the metadata for the at least one multi-media recording to incorporate information into a global index or catalog of additional multi-media recordings. The additional multi-media recordings may be obtained from the same or a plurality of distinct capture devices. The overall global index may be useful to respond to query requests to identify potentially applicable multi-media recordings.
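The catalog-and-query behavior described above can be illustrated with a minimal sketch. The record fields, function names, and file locations below are illustrative assumptions only; the disclosure does not prescribe a particular index structure. The point is that metadata from several capture devices is merged into one index, and a query touches only that index, never the multi-media files themselves:

```python
# Minimal global index over metadata records received from several
# capture devices (record layout is hypothetical, for illustration).
global_index = []

def ingest(record):
    """Incorporate one device's metadata into the global catalog."""
    global_index.append(record)

def query(event_tag=None, device_id=None):
    """Identify potentially applicable recordings by metadata alone;
    return only their access locations, not the recordings."""
    hits = global_index
    if event_tag is not None:
        hits = [r for r in hits if r["event_tag"] == event_tag]
    if device_id is not None:
        hits = [r for r in hits if r["device_id"] == device_id]
    return [r["media_location"] for r in hits]

ingest({"device_id": "car-7", "event_tag": "chase",
        "media_location": "file://car-7/evidence/0003.mpg"})
ingest({"device_id": "bwc-12", "event_tag": "DWI",
        "media_location": "file://station/evidence/0009.mpg"})
print(query(event_tag="chase"))
```

Because only the small metadata records are uploaded and indexed, this kind of query can span recordings held on many distinct local stores without transferring any large file.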
  • In a third aspect of this disclosure a method of managing a plurality of multi-media recordings is disclosed. The method may include receiving first metadata information having information regarding attributes describing recording circumstances attributable to a first multi-media recording obtained by a first recording device. The first metadata information may be stored in an associated external file rather than embedded into the multi-media recordings. The metadata may be correlated with other information about additional multi-media recordings. Overall, the metadata may be managed independently of the recordings and provide location information (e.g., storage location) for selected multi-media files. A user interface may be provided to allow query type functions to interface with the correlated information to identify potentially relevant recordings based on a query request.
  • In a fourth aspect of this disclosure, a docking station is disclosed. The docking station may be configured to manage the multi-media recordings and assist with overall management of multi-media recordings as discussed throughout this disclosure. The docking station may be configured to automate and possibly prioritize some or all of the disclosed management functions.
  • Other aspects of the embodiments described herein will become apparent from the following description and the accompanying drawings, illustrating the principles of the embodiments by way of example only.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • It being understood that the figures presented herein should not be deemed to limit or define the subject matter claimed herein, the applicants' disclosure may be understood by reference to the following description taken in conjunction with the accompanying drawings, in which like reference numerals identify like elements.
  • FIGS. 1A-B illustrate a rear view and a front view, respectively, of a device for capturing (e.g., recording) multi-media and metadata according to some disclosed embodiments.
  • FIGS. 2A-C illustrate block diagrams of a processing system and two example removable storage devices that may be used for the disclosed integrated mobile surveillance system to capture and store multi-media files and associated metadata according to some disclosed embodiments.
  • FIG. 3 illustrates a block system diagram showing some additional internal components for the device of FIGS. 1A-B, according to some disclosed embodiments.
  • FIG. 4 illustrates an intelligent docking, upload, and charging station for battery packs and portable recording devices according to some disclosed embodiments.
  • FIG. 5 illustrates a possible process flow to “checkout” a portable device (e.g., body worn camera, wireless microphone), including a storage device, that may be used by specific law enforcement personnel for the duration of checkout and assist in chain of custody procedures according to some disclosed embodiments.
  • FIG. 6 illustrates possible data flow and Software as a Service (SaaS) components for working with information stored in a “hybrid” manner according to some disclosed embodiments.
  • FIG. 7 illustrates a flow chart depicting one possible process for data mining of information collected by a plurality of surveillance systems according to some disclosed embodiments.
  • FIGS. 8A-F illustrate excerpts of metadata files using eXtensible Markup Language (XML) for the data format, according to some disclosed embodiments.
  • NOTATION AND NOMENCLATURE
  • Certain terms are used throughout the following description and claims to refer to particular system components and configurations. As one skilled in the art will appreciate, the same component may be referred to by different names. This document does not intend to distinguish between components that differ in name but not function. In the following discussion and in the claims, the terms “including” and “comprising” are used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to . . . . ” Also, the term “couple” or “couples” is intended to mean either an indirect or direct connection. Thus, if a first device couples to a second device, that connection may be through a direct connection, or through an indirect connection via other devices and connections.
  • As used throughout this disclosure the terms “computer device” and “computer system” will both be used to refer to an apparatus that may be used in conjunction with disclosed embodiments of connectable storage drives and self-contained removable storage devices. As used herein, a computer device may be thought of as having a subset of functionalities as compared to a computer system. That is, a computer device may refer to a special purpose processor-based device such as a digital video surveillance system primarily configured for executing a limited number of applications. A computer system may more generally refer to a general purpose computer such as a laptop, workstation, or server which may be configured by a user to run any number of off the shelf or specially designed software applications. Computer systems and computer devices will generally interact with disclosed storage drives included in embodiments of the disclosed portable recording device in the same or similar ways.
  • The term “hybrid storage” is used in this disclosure to describe that data associated with accessing and managing multi-media files may be stored in a plurality of locations as opposed to a single location and not embedded within the multi-media file itself. For example, metadata files containing attributes of associated multi-media files, and/or data collected or maintained in association with multi-media files, may be stored remotely from the multi-media files themselves. Metadata files are typically considerably smaller in size than multi-media files. Thus, metadata files are more easily transferred across data links that may have limited bandwidth. As explained further below, hybrid storage may allow for searching and indexing of numerous multi-media files without requiring unnecessary transfer of the potentially large multi-media files (e.g., video/audio recordings). For simplicity the term “multi-media” will be used throughout this disclosure to refer to files collected (e.g., recorded) by an audio or audio/video recorder. Multi-media files may include only audio, only video, or audio and video together and the information may be compressed using an industry standard compression technology (e.g., Motion Picture Expert Group (MPEG) standards, Audio Video Interleave (AVI), etc.) or another proprietary compression or storage format.
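A minimal sketch of the hybrid arrangement follows (field names and the storage location are hypothetical; the actual metadata content is described with reference to the figures): a small sidecar record is kept on remote storage while the large recording stays on local storage, so search and maintenance functions only ever handle the record.

```python
from dataclasses import dataclass

@dataclass
class MetadataRecord:
    """Sidecar metadata record kept separately from the recording.

    Field names here are illustrative assumptions, not a schema
    required by the disclosure.
    """
    recording_id: str
    officer_id: str      # who initiated the recording
    event_tag: str       # why it was recorded (e.g., "DWI")
    timestamp_utc: str   # when it was recorded
    media_location: str  # access location of the large media file

# Only this small record is transferred over a possibly
# bandwidth-limited link; the multi-media file stays put.
record = MetadataRecord(
    recording_id="unit42-0001",
    officer_id="officer-17",
    event_tag="DWI",
    timestamp_utc="2015-04-14T13:05:00Z",
    media_location="file://local-server/evidence/unit42-0001.mpg",
)
print(record.media_location)
```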
  • The term “recording circumstances” is used herein to describe that metadata information associated with an instance of a multi-media recording may contain information describing attributes associated with the act of actual recording of that multi-media file. That is, the metadata may describe who (e.g., Officer ID) or what (e.g., automatic trigger) initiated the recording. The metadata may also describe where the recording was made. For example, location may be obtained using global positioning system (GPS) information. The metadata may also describe why (e.g., event tag) the multi-media recording was made. In addition, the metadata may also describe when the recording was made using timestamp information obtained in association with GPS information or from an internal clock, for example. From these types of metadata, circumstances that caused the multi-media recording may provide more information about the multi-media recording. This metadata may include useful information to correlate multi-media recordings from multiple distinct surveillance systems. This type of correlation information, as described further below, may assist in many different functions (e.g., query, data retention, chain of custody, and so on).
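As one hedged illustration of the who/what/where/why/when attributes described above (the element names below are invented for this sketch; actual XML excerpts appear in FIGS. 8A-F), the recording circumstances might be serialized into a small structured text file:

```python
import xml.etree.ElementTree as ET

# Build a small "recording circumstances" metadata document.
# Element names are illustrative, not the schema of FIGS. 8A-F.
root = ET.Element("recording")
ET.SubElement(root, "who").text = "OfficerID-0017"         # who initiated
ET.SubElement(root, "trigger").text = "light-bar"          # what triggered
ET.SubElement(root, "where").text = "29.7604,-95.3698"     # GPS location
ET.SubElement(root, "why").text = "speeding"               # event tag
ET.SubElement(root, "when").text = "2015-04-14T13:05:00Z"  # GPS time sync

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

A file like this is a few hundred bytes, which is why it can be uploaded and correlated across many surveillance systems even when the associated recordings cannot be.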
  • This disclosure also refers to storage devices and storage drives interchangeably. In general, a storage device/drive represents a medium accessible by a computer to store data and executable instructions. Also, throughout this disclosure reference will be made to “plugging in” a storage drive. It is noted that “plugging in” a storage drive is just one way to connect a storage drive to a computer device/system. This disclosure is not intended to be limited to drives that physically “plug in,” and disclosed embodiments are also applicable to devices that are “connected” to a computer device or computer system. For example, devices may be connected using a cable or a computer bus. Additionally, references to “removable” storage are analogous to plugging-in/unplugging a device, connecting/disconnecting cabled access to a device, and/or establishing/disconnecting networked access to a device or storage area on a network (either wired or wireless).
  • DETAILED DESCRIPTION OF DISCLOSED EMBODIMENTS
  • While various embodiments are described herein, it should be appreciated that the present disclosure encompasses many inventive concepts that may be embodied in a wide variety of contexts. Thus, the following detailed description of exemplary embodiments, read in conjunction with the accompanying drawings, is merely illustrative and is not to be taken as limiting the scope of this disclosure. Rather, the scope of the invention is defined by the appended claims and equivalents thereof.
  • Illustrative embodiments of this disclosure are described below. In the interest of clarity, not all features of an actual implementation are described for every embodiment disclosed in this specification. In the development of any such actual embodiment, numerous implementation-specific decisions must be made to achieve the design-specific goals, which will vary from one implementation to another. It will be appreciated that such a development effort, while possibly complex and time-consuming, would nevertheless be a routine undertaking for persons of ordinary skill in the art having the benefit of this disclosure.
  • Embodiments of the present disclosure provide for management of multi-media files and associated metadata that might be collected by one or more mobile surveillance systems, portable video recording devices, and other types of data recorders. The mobile (and possibly stationary) surveillance system devices may be configured to capture video, audio, and data parameters pertaining to activity in the vicinity of the surveillance system, for example a police vehicle. Other types of vehicles and other situations requiring a surveillance unit are also within the scope of this disclosure. Other types of vehicles may include, but are not limited to, any transportation means equipped with a mobile surveillance system (e.g., civilian transport trucks). The disclosed embodiments are explained in the context of mobile surveillance systems for vehicles that aid in law enforcement, such as buses, ambulances, police motorcycles or bicycles, fire trucks, airplanes, boats, military vehicles, etc. However, in some embodiments, data collected from other types of vehicles, including non-law-enforcement vehicles, may be collected as a possible aid to law enforcement (or for other applicable uses), at least in part because of the disclosed data mining and coordination techniques.
  • Mobile surveillance systems have been in use by police departments for the past few decades. Over that period of time, several advances have been introduced in the technology used to provide video/audio and data regarding specific police events. In the late 1990s through the early 2000s, digital technologies became prevalent in the industry, replacing existing analog technologies. With the use of digital technologies, law enforcement agencies obtained several advances over previous technologies and may further benefit from additional advances (e.g., as described in this disclosure). In general, digital technologies are more adaptable and offer more opportunities for improvement than corresponding analog technologies. This is largely because digital video/audio files can be processed in a multitude of ways by specifically configured computer devices. This disclosure elaborates on several novel techniques to enhance the capability, reliability, ease of use, security, integrity, and other aspects of mobile surveillance systems and the information they collect.
  • Today, there are numerous surveillance systems in use by law enforcement, and the data they collect continues to increase in volume and complexity. Accordingly, enhanced management techniques for the amount of available data may be required. Additionally, there is a need to improve data access and distribution, integrity, reliability, and security throughout the lifecycle of that data. Legal requirements for data collected by a remote/mobile surveillance system include conformance to judiciary requirements such as “chain of custody/evidence” and “preservation of evidence.” Chain of custody (CoC), in legal contexts, refers to the chronological documentation or paper trail audit showing the seizure, custody, control, transfer, analysis, and disposition of physical or electronic evidence. Preservation of evidence is a closely related concept that refers to maintaining and securing evidence from a particular crime scene before it ultimately appears in a courtroom. For example, the evidence may go to a forensic laboratory prior to arriving at the courtroom. Evidence admissibility in court is predicated upon an unbroken chain of custody. It is important to demonstrate that the evidence introduced at trial is the same evidence collected at the crime scene [that is, all access to the evidence (e.g., electronic files) was controlled and documented] and that the evidence was not altered in any way. Requirements for law enforcement are further described in “Criminal Justice Information Services (CJIS) Security Policy,” version 5.3, published Aug. 4, 2014, referenced as “CJISD-ITS-DOC-08140-5.3,” which is hereby incorporated by reference in its entirety.
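The chain-of-custody requirement above amounts to an append-only, tamper-evident log of every action taken on a piece of evidence. A minimal sketch of such a log follows (the entry format and hash-chaining scheme are assumptions for illustration, not the mechanism specified by the disclosure or by CJIS policy):

```python
import datetime
import hashlib
import json

# Append-only audit trail sketch: each entry is hash-chained to the
# previous one, so altering any earlier entry is detectable.
audit_log = []

def log_access(evidence_id, actor, action):
    entry = {
        "evidence_id": evidence_id,
        "actor": actor,
        "action": action,  # e.g., "capture", "transfer", "export"
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "prev": audit_log[-1]["digest"] if audit_log else None,
    }
    # Digest covers the entry contents plus the previous digest.
    entry["digest"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    audit_log.append(entry)

log_access("unit42-0001", "officer-17", "capture")
log_access("unit42-0001", "clerk-3", "transfer")
print(len(audit_log))
```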
  • As will be recognized, disclosed embodiments may allow for comprehensive back-office video management software to be provided using a Software as a Service (SaaS) architecture, giving each agency (even small remote agencies) the tools they need to capture, transfer, store and manage their digital video evidence from car to court. That is, the disclosed system and back-office management techniques meet the preservation of evidence requirements outlined above with respect to management of digital evidence for law enforcement. All activity with respect to digital evidence in the back-office system may be logged to ensure proper documentation of evidence handling. The disclosed system may include electronic transfer of evidence in a controlled manner and may provide comprehensive coordination of potential evidence captured from a plurality of surveillance systems. The disclosed system may also include integrated DVD burning software for easy and accurate evidence transfer.
  • Referring now to FIGS. 1A-B, disclosed embodiments of an integrated mobile surveillance system 100 are intended to incorporate a plurality of functions as being “built-in” to mobile surveillance system 100. Additionally, aspects of integrated mobile surveillance system 100 have been designed with consideration for future expansion as new technologies and capabilities become available. Aspects of integrated system 100 include, but are not limited to, the following integrated functional units. Integrated system 100 may be configured to have one or more than one of each of these functional units, as appropriate. Integrated wireless microphone antenna connector 105 allows capture of audio from a remote wireless microphone located within proximity of integrated system 100. An external multi-conductor interface cable 110 allows a wired connection to one or more internal interfaces of integrated system 100. One or more Universal Serial Bus (USB) ports, such as USB port 140, may be provided for general peripheral connectivity and expansion according to some disclosed embodiments. An integrated global positioning system (GPS) module 120 with optional external antenna or connector 115 is used in part for capturing location data, time sync, and speed logging. The GPS information may also be used for time synchronization and to coordinate data, ultimately facilitating map based search and synchronization (e.g., locate recorded information from a time and/or location across a plurality of recording devices). Dual front facing cameras 125 may include both a wide angle video camera and a tight field of view camera for optical zoom effect snap shots. A record indicator 130 provides an indication of a current operating mode for integrated system 100. A wired Ethernet adapter (e.g., Gigabit, 10/100 BASE-T, etc.) 135 (or a wireless network adapter, not shown) may be provided for data upload, computer interface, remote display, and configuration.
Additionally, multiple wireless data communication devices (not shown) may be integrated for flexibility and expansion. For example, the system may include adapters conforming to wireless communication specifications and technologies such as 802.11, Bluetooth, radio-frequency identification (RFID), and near field communication (NFC). Each of these interfaces may be used, at least in part, for data exchange, device authentication, and device control. A serial port (not shown) may be used to interface with radar/laser speed detection devices and other devices as needed. A G-Sensor/Accelerometer (not shown) may be used for impact detection and to automatically initiate record mode. The G-Sensor/Accelerometer may also provide data logging for impact statistics and road condition data. A DIO (Digital Input/Output) interface (not shown) may be used for external triggers to activate record mode and/or provide metadata to the system. The DIO can also be used to control external relays or other devices as appropriate. The DIO can also be used to detect brake, light bar, car door, and gun lock signals so that video recording can be automatically triggered. As shown in FIGS. 1A-B, a combination power button and brightness control 145 can be used to turn on the system and control the brightness of the monitor after the system is turned on. Programmable function button 150 provides a user definable external button for easy access to instigate any function provided by integrated system 100. For example, rather than traversing through a set of menus on articulating touchscreen 165, a user may define function button 150 to perform an action with one touch (e.g., instant replay, event tagging of a particular type, etc.). An articulating touchscreen 165 may be used to view video in real-time, or in one or more play back modes. Touchscreen 165 may also serve as an input mechanism, providing a user interface to integrated system 100.
An integrated speaker (not shown) may be used for in-car audio monitoring and in-car video/audio file playback. An integrated internal battery 155 is shown; it allows proper shutdown in the event of sudden power loss from the vehicle, as might occur as a result of a crash, for example. Also depicted is a removable SSD flash drive 170 (e.g., secure digital (SD) or universal serial bus (USB) type), including any type of storage that may be inserted or attached to the system via a storage interface (e.g., SCSI, SATA, etc.). For security of access to data, removable SSD flash drive 170 may be secured via a mechanical removable media key lock 160. In some embodiments, event based data is recorded and written to the removable drive to be transferred to a back office server for storage and management. Wireless microphone sync contacts 175 may be configured to synchronize a wireless microphone/camera, such as a body worn camera and microphone, for communication with integrated system 100. In addition to actual sync contacts, which require physical contact, other synchronization methods for wireless microphones/cameras include utilizing NFC or RFID capability between the wireless device and integrated system 100.
  • In addition to the components mentioned above, disclosed embodiments of integrated mobile surveillance system 100 may be configured to include functional components to provide operational characteristics that may include the following. In accordance with some embodiments, a pre-event playback function may be used to tag historical events. In normal operation, integrated mobile surveillance system 100 may record continuously to internal storage and store tagged information (e.g., marked for export) to removable storage. However, for the case of an incident that occurs without a timely event trigger, the operator may instruct the system to navigate back to an earlier time captured in the internal storage and play back that portion of video/audio information. The selected video, at any available point in time, may be marked, tagged for extraction, and stored to removable storage, as if the event had been tagged at that point in time. In accordance with some other embodiments, a component may provide an instant replay function configured to play back the last predetermined amount of time with one button press. Note that both the instant replay and pre-event playback functions (along with general system operation) allow for simultaneous playback while the system is concurrently recording information. Pre-defined event tags and a pre-defined event tagging function may also be provided. For example, tags may include DWI, felony, speeding, stop sign, chase, etc. The tagging action may be used to catalog portions of recorded data. For example, after an event is indicated as ending (e.g., by a stop recording indication), an option to select a predefined event may be displayed. Upon selection, the system may allow an associated portion of collected information to be marked in a text file for current and future identification and storage.
Further, when the tagged information is transferred to the data management software, the tagged information may be searched by event type and maintained on the server for a predefined retention period based on the event type. A streaming function may also be provided to stream live view and recorded video, audio, and/or data over available wireless and wired networks. The integrated system 100 may also integrate “hotspot” capabilities which allow the system to serve as an agency accessible, mobile wireless local area network (WLAN).
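The event-type-based retention described above can be sketched as a simple lookup that the server might consult when deciding how long to keep a tagged recording. The retention periods below are invented placeholders for illustration; the disclosure does not specify particular durations:

```python
# Retention policy sketch: the server keeps tagged recordings for a
# period determined by event type. Day counts are hypothetical.
RETENTION_DAYS = {
    "DWI": 730,
    "felony": 3650,
    "speeding": 180,
    "stop sign": 90,
    "chase": 730,
}
DEFAULT_RETENTION_DAYS = 30  # untagged or unrecognized events

def retention_days(event_tag):
    """Look up how long a recording with this tag is retained."""
    return RETENTION_DAYS.get(event_tag, DEFAULT_RETENTION_DAYS)

print(retention_days("felony"), retention_days("unknown"))
```

Because the tag lives in the external metadata rather than inside the multi-media file, a retention sweep can run entirely against the metadata index.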
  • Referring now to FIGS. 2A-C, possible internals and peripheral components of an example device 200, which may be used to practice the disclosed functional capabilities of integrated surveillance system 100, are shown. Example device 200 comprises a programmable control device 210, which may be optionally connected to input device 260 (e.g., keyboard, mouse, touchscreen, etc.), display 270, or program storage device 280. Also included with programmable control device 210 is a network interface 240 for communication via a network with other computers and infrastructure devices (not shown). Note that network interface 240 may be included within programmable control device 210 or be external to programmable control device 210. In either case, programmable control device 210 may be communicatively coupled to network interface 240. Also, note that program storage device (PSD) 280 represents any form of non-volatile storage including, but not limited to, all forms of optical and magnetic storage elements including solid-state storage.
  • Programmable control device 210 may be included in a device 200 and be programmed to perform methods, including hybrid storage of metadata and associated multi-media files, in accordance with this disclosure. Programmable control device 210 comprises a processing unit (PU) 220, input-output (I/O) interface 250, and memory 230. Processing unit (PU) 220 may include any programmable controller device including, for example, the Intel Core®, Pentium® and Celeron® processor families from Intel and the Cortex ARM processor families from ARM® (INTEL® CORE®, PENTIUM® and CELERON® are registered trademarks of the Intel Corporation. CORTEX® is a registered trademark of the ARM Limited Corporation. ARM® is a registered trademark of the ARM Limited Company). Memory 230 may include one or more memory modules and comprise random access memory (RAM), read only memory (ROM), programmable read only memory (PROM), programmable read-write memory, and solid state memory. One of ordinary skill in the art will also recognize that PU 220 may also include some internal memory including, for example, cache memory.
  • Various changes in the materials, components, circuit elements, as well as in the details of the illustrated systems, devices and below described operational methods are possible without departing from the scope of the claims herein. For instance, acts in accordance with disclosed functional capabilities may be performed by a programmable control device executing instructions organized into one or more modules (comprised of computer program code or instructions). A programmable control device may be a single computer processor (e.g., PU 220), a plurality of computer processors coupled by a communications link or one or more special purpose processors (e.g., a digital signal processor or DSP). Such a programmable control device may be one element in a larger data processing system such as a general purpose computer system. Storage media, as embodied in storage devices such as PSD 280 and memory internal to program control device 210 are suitable for tangibly embodying computer program instructions. Storage media may include, but not be limited to: magnetic disks (fixed, floppy, and removable) and tape; optical media such as CD-ROMs and digital video disks (DVDs); and semiconductor memory devices such as Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Gate Arrays and flash devices. These types of storage media are also sometimes referred to as computer readable medium or program storage devices.
  • FIG. 2B illustrates a secure digital (SD) card 285 that may be configured as the programmable storage device described above. An SD card is a nonvolatile memory card format for use in portable devices, such as mobile phones, digital cameras, handheld consoles, and tablet computers, etc. An SD card may be inserted into a receptacle on the device conforming to the SD specification or may alternately be configured with an interface to allow plugging into a standard USB port (or other port). An example of the adapter for USB compatibility 286 is illustrated in FIG. 2C. Modern computer operating systems are typically configured to automatically permit access to an SD card when it is plugged into an active computer system (sometimes referred to as plug-n-play). In computing technologies, a plug and play device or computer bus is one with a specification that provides for or facilitates the discovery of a hardware component in a system without the need for physical device configuration or user intervention in resolving resource conflicts. Because of additional security requirements regarding data access with respect to the law enforcement field, disclosed systems may incorporate a specifically modified interface to the removable storage drive utilized in device 100 (i.e., removable media 170). Modifications permitting specialized access to removable media, such as a secure storage drive, are described in co-pending U.S. patent application Ser. No. 14/588,139, entitled “Hidden Plug-in Storage Drive for Data Integrity,” by Hung C. Chang, which is incorporated by reference herein. Modifications permitting specialized functionality from removable media are described in co-pending U.S. patent application Ser. No. 14/593,722, entitled “Self-contained Storage Device for Self-contained Application Execution,” by Allan Chen et al., which is incorporated by reference herein.
  • Referring now to FIG. 3, block diagram 300 illustrates one embodiment of an integrated audio-video-data surveillance system. Note that each of the components shown in block diagram 300 may be communicatively coupled to other components via communication channels (e.g., bus) not shown in the block diagram. The flow arrows of block diagram 300 are general in nature to illustrate the movement of information. In use, video and audio may be captured by camera 305 and microphone 306, respectively. Captured data may be provided initially to video/audio encoder 310 to encode and optionally compress the raw video data, and the encoded data may be stored in a memory area (not shown) for access by CPU 315. Encoded data may also be selectively stored to either internal failsafe hard drive 320 or removable mobile hard drive 325 individually, or to both simultaneously. Data may also be transferred, for example at the direction of a user, from internal failsafe hard drive 320 to removable mobile hard drive 325. Data capture devices such as general purpose input output (GPIO) 330 and GPS 331 may be used to capture metadata to associate with captured surveillance information (e.g., multi-media files). All pertinent captured metadata may be associated with captured video/audio recordings using structured text files such as, for example, eXtensible Markup Language (XML) files. An example of such structured text files is explained in more detail below with reference to FIGS. 8A-F. In addition to captured metrics provided by real-time capture inputs, XML files may be utilized to store many different types of metadata associated with captured video and data. This collection of metadata may be used to describe “recording circumstances” attributable to the surveillance information (e.g., multi-media recordings). That is, the metadata may describe when, where, who, and why information, among other things, to indicate information about the act of recording the surveillance information.
The metadata may include, but not be limited to, timestamps of capture (the internal clock (not shown) of system 100 may be synchronized using GPS data), event tags, GPS coordinates, GPS and RADAR/LIDAR measurements from a target vehicle, breathalyzer analysis information, analytical information, and so on. Analytical information will be discussed in more detail below with reference to FIG. 7. Wireless interface 335 (or a wired interface (not shown) when available) may be used to upload information from one or more surveillance systems to back office servers located, for example, at a police station or to cloud based resources. Back office servers and cloud based resources will be discussed in more detail below with reference to FIG. 6.
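The XML-based association of metadata with recordings described above can be sketched in Python; the snippet below builds one metadata element from hypothetical recording-circumstance attributes (the attribute names and values are assumptions for illustration, not the actual file schema of FIGS. 8A-F):

```python
import xml.etree.ElementTree as ET

def build_metadata_row(**attributes):
    """Build a <metadata><row .../></metadata> fragment whose attribute/value
    pairs record the recording circumstances (when, where, who, why)."""
    root = ET.Element("metadata")
    ET.SubElement(root, "row", {k: str(v) for k, v in attributes.items()})
    return ET.tostring(root, encoding="unicode")

# Hypothetical attribute names, for illustration only.
xml_text = build_metadata_row(
    timestamp="2015-04-14T13:05:22Z",  # internal clock, GPS-synchronized
    eventTag="TrafficStop",
    gpsLat="29.7604",
    gpsLon="-95.3698",
)
```

The same fragment could later be parsed back into attribute/value pairs for indexing, consistent with the structured files described with reference to FIGS. 8A-F.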
  • Referring now to FIG. 4, advanced docking station 400 may provide additional benefits for users that maintain a plurality of portable body worn cameras 450 and/or a plurality of surveillance systems. Some or all portable body worn cameras 450 may incorporate one or more programmable function buttons 405. As shown in FIG. 4, docking station 400 may have multiple ports/cradles 415. Docking station 400 may assist in data upload, device checkout, device upgrade (e.g., firmware/software update), recharging of battery packs 420 and other maintenance type functions that may be performed, for example, at a police station. For clarity, not all repeated elements in FIG. 4 have an associated reference number. Embodiments of the disclosed docking station may support maintenance functions for multiple portable devices such as body worn cameras 450 concurrently. The disclosed docking station 400 may be multifunctional for uploading and/or downloading of video/audio and associated metadata. Configuration data such as unit ID, user ID, operational modes, updates, and so on, may be maintained and versions of such configuration information may be presented on display screen 410 (which may also be a touchscreen interface to docking station 400).
  • Docking station 400 may have integrated interfaces to different types of surveillance systems. Interfaces such as USB, wired Ethernet or wireless network, as well as interface ports for battery charging, may be included. Docking station 400 may also contain a CPU and be configured as a computer device (see FIG. 1) with optional integrated touchscreen display 410 and output connectors (not shown) for an optional external display/mouse or device expansion. Docking station 400 may have an option for a wireless display (not shown) to be used for status indication as well as for an interface for checkout/assignment of surveillance system devices to a user or group of users (see FIG. 5). Docking station 400 may include wireless communications such as Bluetooth and/or 802.11ac/ad. Docking station 400 may also be configured to work as an Access Point for a wireless network or may be configured to act as a bridge to allow portable client devices to access functionality of docking station 400 and possibly connect to other system components including local or cloud based servers. Docking station 400 may also include functional software or firmware modules to support hybrid storage of recorded multi-media and associated metadata automatically. Hybrid storage is discussed in more detail below with reference to FIG. 7.
  • Docking station 400 may also have an internal storage device to facilitate fast off-load storage, which may be used to facilitate a download/forward process for audio/video and metadata captured on a surveillance system device (e.g., body worn camera 450). For example, the user may place the body worn camera 450 into a docking station cradle 415 and docking station 400 offloads the data to the local onboard storage drive (not shown), which can immediately (or based on a timer) upload that information, or a portion thereof if a hybrid model is used, to a server (e.g., back office server or cloud storage). Uploads may be prioritized based on many different attributes such as time, size, event type priority, and so on. Docking station 400 may also have an integrated locking mechanism for one or more of the uploading/charging ports/cradles 415. The docking station 400 may be configured to control the locking mechanism to hold or release the wearable device in order to prevent the user from taking it out during uploading/downloading, or to make sure that only the recently "checked out" device is removed, for example.
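The upload prioritization mentioned above (time, size, event type priority) might be realized as a simple sort; the ranking below is an assumed policy for illustration, not one prescribed by this disclosure:

```python
def upload_order(recordings, event_priority):
    """Order offloaded recordings for upload: higher event-type priority
    first, then older capture time, then smaller size (assumed tie-breaks)."""
    return sorted(
        recordings,
        key=lambda r: (-event_priority.get(r["event"], 0), r["time"], r["size"]),
    )

# Hypothetical priority ranking and recordings.
priority = {"DUI": 3, "TrafficStop": 2, "Routine": 0}
queue = upload_order(
    [
        {"id": "a", "event": "Routine", "time": 5, "size": 10},
        {"id": "b", "event": "DUI", "time": 9, "size": 99},
        {"id": "c", "event": "TrafficStop", "time": 1, "size": 50},
    ],
    priority,
)
```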
  • The touchscreen display 410 of FIG. 4 illustrates one possible graphical user interface (GUI) layout as an example only. Actual layouts may contain more information and features and may be configurable based on requirements of different end users. In FIG. 4, the GUI shows examples of upload status and battery charging progress. Other screens may be available on the GUI display 410 to provide other status information such as unit ID, user ID, and/or to assist with user checkout and assignment of devices to different mobile surveillance systems.
  • Referring now to FIG. 5, process flow 500 illustrates a possible method for assisting law enforcement personnel with compliance with chain of custody of evidence requirements for legal evidence. Chain of custody of evidence requirements may be implemented with the assistance of docking station 400. In this example, the computer device at the police station is considered to be docking station 400 (but may be another workstation type device, for example) and a computer device in a police car, for example, will be referred to as a "mobile surveillance system." Both docking station 400 and the mobile surveillance system are example embodiments of computer device 100 of FIG. 1 described above. Beginning at block 505, a portable recording apparatus (e.g., body worn camera 450) including a storage device (e.g., 285, 286) is "checked in" at a police station, for example. In the "checked in" state, the portable recording device may be connected to docking station 400 that is configured to interact with the storage device of the portable recording device. At block 510, docking station 400 receives a request to assign a portable recording device (e.g., body worn camera 450, or wireless microphone) to an officer (e.g., Officer "Joe Smith") for use in a patrol "shift." The request may, for example, come from a GUI presented on touchscreen 410. Optionally, the request may also include information to assign the portable recording device to a particular mobile surveillance system for that shift (e.g., surveillance system of "patrol car 54"). At block 515, docking station 400 writes control information to the storage device of the portable recording device to identify an appropriate mobile device (e.g., 301).
The control information may include storage serial number, officer's ID (e.g., “Joe”), patrol car (e.g., “54”), officer's password (likely encrypted), recording parameter settings, or other information useful in assisting in audit tracking of the portable recording device and any information collected on the storage device of the portable recording device during the shift. At block 520, the portable recording device is removed from docking station 400 for association with a mobile surveillance system (e.g., 301). The portable recording device (e.g., 450) is now in a “checked out” state.
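One way to picture the control information written at block 515 is as a small record on the device's storage. The field names below are illustrative assumptions, and the password is shown hashed because the text notes it would likely be encrypted:

```python
import hashlib
import json

def make_control_record(serial, officer_id, patrol_car, password):
    """Control information written to the portable device at checkout
    (block 515); used later for audit tracking of the device and its data."""
    return {
        "storageSerial": serial,
        "officerID": officer_id,
        "patrolUnit": patrol_car,
        "passwordHash": hashlib.sha256(password.encode()).hexdigest(),
        "state": "checked_out",
    }

record = make_control_record("SD-0285", "JSmith", "54", "secret")
control_blob = json.dumps(record)  # what the docking station would write
```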
  • At block 525, the officer authenticates to a mobile surveillance system. The portable recording device is connected to the mobile surveillance system at block 530. Flow continues to block 535, where the storage device of the portable recording device (e.g., 450) becomes accessible to the mobile surveillance system if authentication information is accurate. Authentication may require that the mobile surveillance system match a previously identified (e.g., at checkout) mobile surveillance system and may optionally only become available after a second check that a proper officer has authenticated to the mobile surveillance system. That is, the portable recording device must be associated with a proper surveillance system (e.g., 301), and the authenticated user must be validated as a proper user. Thus, in this example, Officer "Joe Smith" is authenticated to the mobile surveillance system and the mobile surveillance system is the one in patrol car 54. In this example, the surveillance system in patrol car 54 is the system which Officer Smith should be using for his shift. Accordingly, prior to allowing any access to the storage drive of the portable recording device from the mobile surveillance system, both attributes should be verified. Such increased authentication methods may assist in compliance with chain of custody of evidence requirements for gathering and maintenance of evidence. Note that some law enforcement agencies require two-factor authentication for access to data. Validating "checkout information" regarding both the portable device and the authenticated officer (e.g., both the association with the surveillance system of patrol car 54 and confirming Officer Smith is logged into that system) is one example of two-factor authentication.
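The two-factor check described above (device-to-system association plus officer identity) reduces to a conjunction of both verifications; this minimal sketch assumes checkout-record fields matching the example of Officer Smith and patrol car 54:

```python
def grant_storage_access(checkout, system_id, logged_in_officer):
    """Both attributes must verify before the storage drive is exposed:
    the mobile surveillance system matches the one named at checkout AND
    the authenticated officer matches the assigned officer."""
    return (
        checkout["patrolUnit"] == system_id
        and checkout["officerID"] == logged_in_officer
    )

# Illustrative checkout record.
checkout = {"officerID": "JSmith", "patrolUnit": "54"}
```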
  • At block 540, as the officer performs his shift duties (e.g., goes on patrol, etc.), the mobile surveillance system records and stores evidence and surveillance data onto the storage device of the portable recording device. During the shift, all data recorded on the storage device may be associated with the officer for audit tracking purposes as indicated at block 545. For example, a metadata file may be used to “mark” any recorded data with officer's ID, event type, date/time, GPS location, etc.
  • Next, at block 550 actions that may take place at the end of a shift, for example, are performed. After a shift is completed and the officer, mobile surveillance system, and portable recording device return to the police station, recorded data may be securely (for example, but not limited to, by data encryption) uploaded wirelessly to a back office system at the police station. Securely uploaded, as used here, indicates that the recorded data will be uploaded in a manner as to maintain its association with the officer and maintain chain of custody of evidence requirements as well as any other type of security regarding the wireless network, etc. As an alternative to wireless upload, the officer may remove (e.g., disconnect) the portable recording device (e.g., 450) and relocate the portable recording device to the same or a different docking station 400 for upload at the police station. At block 555, the officer may “check in” the portable recording device so as to allow a different officer to use it on a subsequent shift. For example, checking in may be performed using a GUI interface to docking station 400.
  • In accordance with some embodiments, the above description discloses how multi-media files and associated metadata may be collected. In accordance with other embodiments, a hybrid model for storing and analyzing information may be beneficial for small and large law-enforcement agencies. Law-enforcement agencies with limited staffing and resources may find it difficult to adopt in-car or wearable video system technologies that involve complex, expensive and cumbersome components. For example, an in-house server based solution may require experienced computer technicians/specialists to maintain proper hardware operations. A non-server based solution may also be challenging because it may lack functions such as system configuration, video search and storage management, and evidence life-cycle maintenance. It is contemplated that a cloud based SaaS solution may offer the proper flexibility and convenience required for such law enforcement agencies. Additionally, the disclosed hybrid model for storing metadata independently from actual multi-media files may work more effectively for agencies having limited bandwidth capabilities.
  • In some disclosed embodiments, a remote application and database server may be hosted by a software as a service (SaaS) cloud application to reduce (or eliminate) the need to hire additional computer technicians. Some disclosed embodiments may be implemented in a hybrid cloud and provide local (on site) data storage for portions of data that require high bandwidth across a network (e.g., Internet, police network) while maintaining metadata in the cloud. This configuration may help ensure security and integrity of digital evidentiary data by maintaining a single global copy of metadata in the cloud (for storage) while still allowing fast local access speeds for review of potentially large video/audio files. Also, optionally, data on a shared server may be downloaded to the local data storage site as backup data and then re-uploaded to a remote (or cloud based) site if there is a systems failure or “intrusion” attack at the remote (or cloud based site).
  • To eliminate the need for (or to augment) a conventional DVD burner based system, the user may auto upload all data and metadata to the cloud. Optionally, a user may provide (or user event tags may be used as) identification criteria for certain types of videos (and their metadata) to be sent to the cloud automatically as soon as the videos are uploaded to a server (or staged on docking station 400) with certain “event type” metadata. For example, an administrator may define: all DUI videos are sent to cloud based storage and 2 DVD copies are burned. When an officer tags a video as a DUI event type, as soon as the video is uploaded to the cloud, the video may also be sent to a DVD burner for 2 copies automatically. Alternatively, rather than burning DVD copies, an email may be automatically generated and sent or instructions may be provided to an employee to create and send an email with a time limited access link to personnel or third parties (e.g., prosecuting attorney) who may have an interest in a DUI event. Based on the tag type assigned, a wide number of triggers and follow-on responses may be generated automatically. Furthermore, actions relating to compliance with record retention policies may be automatically generated so that as specific retention periods pass, records are automatically deleted. Thus, the user may readily and easily take advantage of cloud-based storage for an almost limitless cataloguing and archiving device.
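The tag-driven triggers and follow-on responses described above can be modeled as a policy table; the entries below (two DVD copies and an access-link email for DUI events) mirror the example in the text, but the table structure and action names are otherwise assumed:

```python
# Hypothetical event-type policy table.
POLICY = {
    "DUI": ["upload_to_cloud", "burn_dvd", "burn_dvd", "email_access_link"],
    "Routine": ["upload_metadata_only"],
}

def actions_for(event_tag):
    """Return the automatic follow-on responses for a tagged recording;
    untagged or unknown tags fall back to default retention handling."""
    return POLICY.get(event_tag, ["apply_default_retention"])
```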
  • Referring now to FIG. 6, data flow in a content management system that integrates with SaaS functionality is illustrated in block diagram 600. The SaaS component may be a system which typically includes a web-based portal that is the entry point to the software services for all users requiring data access. As with other data access points, access may be controlled by authentication means such as, but not limited to, passwords, fingerprints, encryption, and so on. Authorized users may search media catalogues which may be generated from metadata obtained from a single agency or from multiple jurisdictional agencies. Users may also manage all the configuration settings of mobile/portable video/audio recording devices via a cloud based control portal. Having metadata in the cloud facilitates many different functions, such as query search of metadata associated with audio, video or print media. The metadata in the cloud and an associated interface portal may allow access to any evidentiary logs associated with the data (local or cloud based) and may allow access to a user's local hardware/software storage to review media that may not have been uploaded to cloud storage (e.g., because of bandwidth/storage constraints). That is, the cloud based system may include enough information to allow secure access back to local storage (e.g., 644 and 642) so that a user at police station 640 may efficiently view locally stored multi-media files. Alternatively, a user located remotely from police station 640 may obtain access (e.g., secure access via a virtual private network (VPN)) to network and storage infrastructure at police station 640 and perform desired actions on multi-media files. Of course, bandwidth constraints of the obtained remote access (e.g., VPN) may have an effect on what actions a remote user decides to perform.
  • Local hardware/software storage at police station 640 may be any storage device, such as local hard drives, removable drives, or any type of network storage device, and so on. As shown in FIG. 6, the SaaS functions may incorporate cloud storage (630) which is not typically as limited in storage capacity as local hardware/software storage. However, remote access to large files may have associated communication bandwidth concerns. Such a SaaS content management system may limit data handling (and thus the potential for breaking the evidentiary chain of custody). Data handling may also be limited by initiating data transfer from the local collection point via an upload of data to the cloud storage using the web-based portal. The user may determine which data will remain on local storage and which data resides in the cloud. For example, in such a hybrid storage solution, metadata relating to GIS information and applications for performing data analysis may reside in the cloud, while the related audio/video files remain at the user's facility. This is largely based on the size of the files and recognition that bandwidth to cloud storage may affect access to large files. However, in some situations bandwidth concerns are not a determining factor and other segmentation of data may be desired. In the case where hybrid storage is implemented and a user has local access to large files, a user may more efficiently interact with metadata in the cloud and local multi-media files.
  • A cloud-based 630 video export and access system does away with the hardware and ongoing maintenance costs of optical media based systems by providing users a secure, controlled, reliable and cost-effective method for sending video and data to third parties. Video and data may be uploaded to the cloud for storage, one or more third party recipients may be assigned access rights, and a defined expiration date for third party access may also be provided. Additionally, use of the cloud may permit real-time data upload and storage which provides nearly limitless data storage capacity for integrated system 100 (FIGS. 1A and 1B). Hybrid storage models may be implemented to define pre-requisites as to what actual multi-media files are stored in the cloud. In some embodiments, only multi-media files requiring access by third parties are uploaded to the cloud. In other embodiments, only multi-media files that have been tagged with a particular event type are uploaded to the cloud. In either or both of these embodiments, other multi-media files that may be less important or have not yet been fully analyzed may be maintained on local storage for future consideration. Note that even though multi-media files may be maintained on local storage it may be desirable to upload associated metadata to the cloud based system to provide more comprehensive indexing and searching functionality across all recorded data.
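The hybrid-model pre-requisites described above (metadata always uploaded; the media file only when tagged appropriately and within bandwidth limits) might look like the following sketch; the event list and size budget are assumed parameters:

```python
def cloud_upload_plan(files, cloud_events, size_limit_mb):
    """Hybrid storage sketch: metadata is always sent to the cloud for
    indexing, while the multi-media file itself is uploaded only when its
    event tag is on the cloud list and it fits the assumed size budget."""
    plan = []
    for f in files:
        media = f["event"] in cloud_events and f["size_mb"] <= size_limit_mb
        plan.append({"id": f["id"], "metadata": True, "media": media})
    return plan

plan = cloud_upload_plan(
    [
        {"id": "v1", "event": "DUI", "size_mb": 300},
        {"id": "v2", "event": "Routine", "size_mb": 50},
    ],
    cloud_events={"DUI"},
    size_limit_mb=500,
)
```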
  • Exported data may be stored in cloud-based storage that is remotely accessible through secured means (for example, but not limited to, a password, fingerprint reader, etc.). The system may be configured to send one or more recipients an access link through automated communication methods such as email, text, MMS, etc. The link sent to each recipient may include an expiration date for accessing the associated data. The system may also allow a recipient of the link to review the data stored in the cloud via the Internet, download a local copy of the data for future use, and delete the data after review or download. The link sent to each recipient may also limit access rights of recipients (e.g., read only, data editing, deletion, etc.).
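A time- and rights-limited access link could be implemented many ways; the sketch below uses an HMAC-signed token as one plausible scheme (the signing key, URL layout, and field names are assumptions, not the disclosed mechanism):

```python
import hashlib
import hmac

SECRET = b"station-640-signing-key"  # placeholder signing key

def make_access_link(media_id, expires_at, rights="read"):
    """Produce a link whose signature binds the media id, expiration
    timestamp, and access rights together."""
    msg = f"{media_id}|{expires_at}|{rights}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"https://example.invalid/media/{media_id}?exp={expires_at}&rights={rights}&sig={sig}"

def link_valid(sig, media_id, expires_at, rights, now):
    """A link is honored only if its signature verifies and it is unexpired."""
    msg = f"{media_id}|{expires_at}|{rights}".encode()
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and now < expires_at
```

Because the expiry is part of the signed message, a recipient cannot extend access by editing the link, which matches the expiration behavior described above.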
  • In order to comply with laws, court orders or record-retention policies relating to data access, the system may be configured to remove the accessible data after a predetermined expiration date. A cloud-based system thus allows users to retain the original data while limiting third party access to such data. Once an access link has expired, no third party may access the expired data. The disclosed SaaS system may also provide bookkeeping functions to track content access, bandwidth usage, and subscription expiration, etc. This bookkeeping function may be capable of statistical analysis and billing and may generate reports and invoices as needed.
  • FIG. 6 also graphically illustrates an example data exchange flow in block diagram 600, through which video, audio, and print data and associated metadata may be shared. Numerous users, computer-based functionalities, storage options, and associated lines of communication may be involved in data uploading and downloading. For example, one or several police vehicles 610 may transmit video and audio data and associated metadata via wireless communication means 605 to a cloud storage system 630. Concurrently (or as needed), this data or a subset of this data may be made accessible to software applications, for example SaaS functions 620, via communication link 606. Police vehicle(s) 610 may also manually download data and metadata to local storage 644 upon arrival at police station 640 using data transmission channel 660. Data transmission channel 660 may be a wired connection or a wireless connection. In an alternative, a classical "sneakernet" may be used by connecting a portable recording device to another device (e.g., docking station 400). After connection, data may be uploaded to local storage 644, which is located at the police station, and then optionally (based on a number of different criteria) to the cloud 630 using any appropriate connection (e.g., 645, 650 or another available connection).
  • In the example of block diagram 600, an integrated surveillance system vendor 670 oversees and maintains SaaS functions 620 utilizing communication channel 665. The vendor may also optionally maintain the security and integrity of any cloud based storage system 630 utilizing communication channel 666. Vendor 670 may also provide all necessary technical support through its SaaS functions 620 and communication channel 645 to assist police station 640 in implementing best practices in the preservation of data evidence. Police station 640, depending on available resources, may have "in-house" routers (not shown) and surveillance system backend server(s) 642 which provide redundant data storage systems. Police station 640, in order to avoid expensive data storage solutions, may optionally utilize cloud storage 630 via communication channel 650 in a hybrid manner. Cloud storage system 630 may also communicate directly with SaaS functions through communications channel 655. Having multiple channels of secured communications may provide rapid and efficient data exchange, while use of various storage means (local or cloud-based) allows an inexpensive and flexible alternative for resource-limited users.
  • Referring now to FIG. 7, flow chart 700 illustrates a potential data mining strategy for captured data. The disclosed data mining strategy may benefit from the above discussed hybrid storage model in a number of ways. Example benefits across a single agency or multiple jurisdictionally distinct entities may include sharing of information without violating privacy or other data access concerns. Sample use cases are described following this overview of flow chart 700. Beginning at block 705, at least one surveillance system automatically captures video and audio data and associates that captured data with GPS positioning, timestamp, and other information captured as metadata while the vehicle containing the surveillance system is "on-patrol". All such video and audio data (including metadata) from a single or multiple "on-patrol" vehicles at block 710 may be uploaded to a central storage area (e.g., a cloud) at the end of a law enforcement personnel shift. At decision 715, specially configured software/firmware determines whether a captured data segment has an associated tag (e.g., event type). If not (NO prong of decision 715), then a default tag and associated data retention policy may be applied to the captured data as shown at block 725. The captured data segment for untagged capture may be stored in an area of a computer hard drive with a continuous loop function such that the oldest data is overwritten by newer data. Alternatively, if a data segment has an associated tag (the YES prong of decision 715), retention criteria based on the tag type may be applied and the appropriate data stored with other tagged data as illustrated by block 720. As required, any necessary evidentiary access controls as illustrated by block 720 may be considered. At block 730, data mining is performed. Information gathered from the data mining function may be used to provide a global index of data (e.g., index of data across all available metadata).
Indexed data remains available until the time limit for data retention is reached, and then the data (and its associated index information) may be expunged. However, if, as in block 735, an unpredicted event occurs, for example, a bombing, terrorist activity, report of previous criminal activity, etc., then, at block 740, the data mined in block 730 (for a particular location, date and time) may be retrieved to assist with investigation and evidence gathering. Optionally, as shown in block 745, the overall system may be configured to proactively apply analytics to the captured metadata to identify possible criminal activity or potential threats to public health and safety (e.g., face or pattern recognition analysis to identify a known criminal or threat). Such analysis may then be used to produce an analytics report as shown at block 750. The analytics report, for example, may then be reviewed by law enforcement personnel to assist with an investigation or determine if further investigation is required.
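The tag-based retention decision of blocks 720/725 and the later expunging of indexed data can be sketched as a simple lookup; the retention periods below are assumed values, not agency policy:

```python
# Assumed retention periods, in days, keyed by event tag.
RETENTION_DAYS = {"DUI": 730, "TrafficStop": 180, "_default": 30}

def expired(record, today):
    """True when a record (and its associated index entry) should be
    expunged: untagged captures get the short default window, mirroring
    the continuous-loop overwrite of block 725."""
    keep = RETENTION_DAYS.get(record.get("tag"), RETENTION_DAYS["_default"])
    return today - record["captured_day"] > keep
```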
  • Collecting metadata from multiple surveillance systems to create a comprehensive index may allow a law-enforcement agency to correlate information from different systems. For example, if a set of recordings from different patrol cars at a given geographical location are of interest, then the metadata containing GPS information may identify a subset of multi-media files that may be of interest. If multiple agencies use a common global index, they may be made aware of recordings that other agencies have obtained that would otherwise be unknown to them. After following appropriate legal procedures, they may obtain access to recordings from other agencies to assist in gathering evidence. Note that access to actual multi-media recordings may not be made available because of privacy concerns, for example, but the global index informs of the existence of potentially relevant information. In this manner, coordinated inter-agency information sharing may be enhanced. The hybrid storage model facilitates creation of a global index because the overall size of actual multi-media recordings across numerous surveillance devices may quickly become unmanageable. Additionally, chain of custody of evidence and access controls to actual multi-media files may be maintained.
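A query against such a global metadata index for recordings near a location of interest might be sketched as below; the naive bounding-box match and the index fields are illustrative assumptions (a real system would use proper geospatial distance):

```python
def nearby_recordings(index, lat, lon, radius_deg):
    """Return index entries captured near (lat, lon). Entries carry only
    metadata and the owning agency, so actual media access can still be
    requested through appropriate legal procedures."""
    return [
        e for e in index
        if abs(e["lat"] - lat) <= radius_deg and abs(e["lon"] - lon) <= radius_deg
    ]

# Hypothetical index entries contributed by two agencies.
index = [
    {"id": "carA-17", "agency": "PD-1", "lat": 29.76, "lon": -95.37},
    {"id": "carB-03", "agency": "PD-2", "lat": 29.77, "lon": -95.36},
    {"id": "carC-44", "agency": "PD-1", "lat": 30.50, "lon": -95.00},
]
hits = nearby_recordings(index, 29.76, -95.37, 0.05)
```

Note the query reveals that PD-2 also holds a potentially relevant recording, without exposing the recording itself, consistent with the inter-agency sharing described above.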
  • Each agency may implement the hybrid storage model as necessary based on their size and infrastructure capabilities. Hybrid techniques may also be implemented as a sliding scale. That is, at one extreme a maximal hybrid technique uploads all (or nearly all) captured metadata and associated multi-media files. For example, a large police station with a big cloud presence and high bandwidth might use the maximal hybrid model. At the other extreme, a minimal hybrid technique would upload only enough metadata for indexing and very few (if any) multi-media files. The minimal amount of metadata may allow for global indexing as necessary so that, when required, additional upload of data may be requested from the agency implementing the minimal hybrid model.
  • Referring now to FIGS. 8A-F, examples of the metadata referenced throughout this disclosure are shown in an example XML file format. Note that because of the structure provided by XML, each of the metadata portions of FIGS. 8A-F may be stored in a single file, multiple files, or any other appropriate segregation. Each element of an XML file is partitioned by tags (i.e., <row> followed by </row> as shown in FIGS. 8A-F). For example, a start tag (i.e., <row>) appears at the beginning of element 805 and an end tag (i.e., </row>) appears at the end of element 805. According to some disclosed embodiments, the actual root name of the file (e.g., the filename with no extension) is used as a key for associating the recorded audio/video with the appropriate metadata file. Inside the example element 805 of FIG. 8A there are attribute/value pairs to provide a metadata parameter name and its associated value for that attribute. Metadata attributes shown in FIGS. 8A-F have self-evident names and therefore are not discussed individually here. The examples provided are simply to illustrate that a multitude of different types of data may be captured and used to index or further maintain associated captured surveillance data. In these examples, FIG. 8A illustrates a video metadata file for a captured video segment while FIG. 8F illustrates an XML segment that contains the VX-Sync data. In this embodiment, the "V" of "VX" is used to reference the particular video and the "X" to reference any event variable associated with the particular video. For example, during the recording, any action taken by a user, such as an action that triggered the recording, e.g., pushing a wireless microphone button, or activation of a light bar, or the taking of a snapshot, will be recorded in this metadata file and associated with the variable X for future connection with the video V. FIG.
8B illustrates a sample metadata file with attributes and values relating to in-car activity logs based on personnel shifts, while FIG. 8C illustrates a file portion that relates to GPS location metadata based on personnel shifts, and FIG. 8D illustrates in-car error logs per personnel shift. In the example shown in FIG. 8D, there are no errors to report. FIG. 8E illustrates an example metadata file that may be used to establish an audit trail for the associated video to satisfy evidentiary requirements relating to chain of custody.
  • FIG. 8C illustrates a series of "collected" metadata elements where several attributes have been assigned values based on data collection. For example, the attribute "patrol unit" has a value to identify a particular police vehicle and the officerID attribute has a value corresponding to the identification of a specific officer. Note that officerID may be initially blank, as in elements 820 and 825, and then be assigned an ID number as an officer logs onto (e.g., successfully authenticates to) the integrated system 100 (FIGS. 1A and 1B), as shown in element 830. Another attribute may be "log time" (element 835, FIG. 8E), which is the date and time that a data record is captured. Yet other attributes, which are self-explanatory based on their names, may indicate changes in longitude and latitude reflective of a vehicle that is in motion. In addition, the speed and velocity of the vehicle in motion may be reflected in the metadata.
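Reading such metadata files back follows directly from the structure described for FIGS. 8A-F: the root filename keys the recording to its metadata file, and each <row> carries attribute/value pairs. The sketch below assumes hypothetical attribute names for illustration:

```python
import os
import xml.etree.ElementTree as ET

def metadata_key(path):
    """The root name of a file (no extension) keys the recorded
    audio/video to its associated metadata file."""
    return os.path.splitext(os.path.basename(path))[0]

def read_rows(xml_text):
    """Return the attribute/value pairs of each <row> element."""
    return [dict(row.attrib) for row in ET.fromstring(xml_text).iter("row")]
```

For instance, an initially blank officerID followed by an assigned ID (as in elements 820-830) would appear as successive rows whose officerID attribute changes from empty to a number.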
  • In light of the principles and example embodiments described and illustrated herein, it will be recognized that the example embodiments can be modified in arrangement and detail without departing from such principles. Also, the foregoing discussion has focused on particular embodiments, but other configurations are also contemplated. In particular, even though expressions such as “in one embodiment,” “in another embodiment,” or the like are used herein, these phrases are meant to generally reference embodiment possibilities, and are not intended to limit the invention to particular embodiment configurations. As used herein, these terms may reference the same or different embodiments that are combinable into other embodiments. As a rule, any embodiment referenced herein is freely combinable with any one or more of the other embodiments referenced herein, and any number of features of different embodiments are combinable with one another, unless indicated otherwise.
  • Similarly, although example processes have been described with regard to particular operations performed in a particular sequence, numerous modifications might be applied to those processes to derive numerous alternative embodiments of the present invention. For example, alternative embodiments may include processes that use fewer than all of the disclosed operations, processes that use additional operations, and processes in which the individual operations disclosed herein are combined, subdivided, rearranged, or otherwise altered.
  • This disclosure may include descriptions of various benefits and advantages that may be provided by various embodiments. One, some, all, or different benefits or advantages may be provided by different embodiments. In view of the wide variety of useful permutations that may be readily derived from the example embodiments described herein, this detailed description is intended to be illustrative only, and should not be taken as limiting the scope of the invention. What is claimed as the invention, therefore, are all implementations that come within the scope of the following claims, and all equivalents to such implementations.

Claims (25)

1. A computer system, comprising:
one or more processors; and
one or more network communication interfaces communicatively coupled to the one or more processors,
wherein the one or more processors are configured to execute instructions to cause the one or more processors to:
receive, via the one or more network communication interfaces, first metadata information pertaining to a multi-media recording, the first metadata information comprising information regarding attributes describing recording circumstances of the multi-media recording and an access location of the multi-media recording;
initiate transmission of at least a portion of the first metadata information to a first storage location; and
categorize the multi-media recording using the first metadata information independently of the multi-media recording.
2. The computer system of claim 1, wherein the one or more processors are further configured to execute instructions to cause the one or more processors to:
process the first metadata information pertaining to the multi-media recording; and
incorporate the processed first metadata information in an index containing information pertaining to additional multi-media recordings obtained from multiple distinct portable recording devices.
3. The computer system of claim 2, wherein the one or more processors are further configured to execute instructions to cause the one or more processors to:
receive a query request to identify potentially applicable multi-media recordings; and
utilize the index to provide a response to the query request, wherein the response to the query request comprises metadata information pertaining to one or more potentially applicable multi-media recordings or information identifying a storage location of one or more potentially applicable multi-media recordings.
4-6. (canceled)
7. The computer system of claim 1, wherein the information regarding attributes describing recording circumstances comprises information pertaining to items selected from the group consisting of: recording location, recording time, recording initiation, recording termination, recording duration, event type/tag, and a user who performed the multi-media recording.
8. The computer system of claim 1, wherein the one or more processors are further configured to execute instructions to cause the one or more processors to:
apply audit controls regarding access to and/or alteration of the first metadata information and/or the multi-media recording.
9. (canceled)
10. The computer system of claim 1, wherein the one or more processors are further configured to execute instructions to cause the one or more processors to:
receive an indication identifying one or more potentially applicable multi-media recordings, the indication responsive to a query request; and
request initiation of transmission of at least one of the one or more potentially applicable multi-media recordings from a remote storage location to a storage location accessible to the computer system.
11. The computer system of claim 1, wherein the one or more processors are further configured to execute instructions to cause the one or more processors to:
apply a data retention policy to the multi-media recording, wherein the data retention policy indicates a data retention period for the multi-media recording based on the first metadata information.
12-22. (canceled)
23. The computer system of claim 1, wherein the one or more processors are further configured to execute instructions to cause the one or more processors to:
initiate transmission of the multi-media recording to a second storage location different from the first storage location.
24. The computer system of claim 23, wherein the first storage location is a local network accessible storage location and the second storage location is a remote storage location.
25. The computer system of claim 1, wherein the one or more processors are further configured to execute instructions to cause the one or more processors to:
copy the first metadata information and/or the multi-media recording to a local storage area distinct from the first storage location.
26. The computer system of claim 1, further comprising:
a local storage area communicatively coupled to the one or more processors; and
a plug-in port communicatively coupled to the one or more processors and configured to interface with a portable recording device,
wherein the initiation of transmission of the first metadata information to the first storage location and/or initiation of transmission of the multi-media recording to a second storage location occurs automatically upon connection of a portable recording device to the plug-in port, the portable recording device storing the first metadata information and/or the multi-media recording, respectively.
27. The computer system of claim 1, wherein the one or more processors are further configured to execute instructions to cause the one or more processors to:
correlate the first metadata information with additional metadata information to produce correlated metadata information, the additional metadata information comprising information regarding attributes describing recording circumstances of one or more additional multi-media recordings.
28. The computer system of claim 27, wherein the one or more processors are further configured to execute instructions to cause the one or more processors to:
provide an interface to access the correlated information, wherein the interface provides information to assist in audit control of the first and additional metadata information and the first and the one or more additional multi-media recordings.
29. The computer system of claim 27, wherein the correlating of the first metadata information with the additional metadata information to produce correlated metadata information comprises correlating the first and the one or more additional multi-media recordings based on an event type associated with each of the first and the one or more multi-media recordings, respectively, a recording location of each of the first and the one or more multi-media recordings, respectively, and/or a recording time of each of the first and the one or more multi-media recordings, respectively.
30. A method comprising:
receiving at a computer system, via one or more network communication interfaces, first metadata information pertaining to a first multi-media recording, the first metadata information comprising information regarding attributes describing recording circumstances of the first multi-media recording and an access location of the first multi-media recording;
initiating transmission of at least a portion of the first metadata information to a first storage location, the first storage location comprising a storage area configured for storing at least metadata; and
categorizing the first multi-media recording using the first metadata information independently of the first multi-media recording.
31. The method of claim 30, further comprising:
receiving a query request to identify potentially applicable multi-media recordings; and
utilizing an index to provide a response to the query request,
wherein the response to the query request comprises metadata information pertaining to one or more potentially applicable multi-media recordings or information identifying a storage location of one or more potentially applicable multi-media recordings, and
wherein the index contains metadata information pertaining to multi-media recordings obtained from one or more portable recording devices.
32. The method of claim 30, further comprising:
applying audit controls regarding access to and/or alteration of the first metadata information and/or the first multi-media recording.
33. The method of claim 30, further comprising:
receiving an indication identifying one or more potentially applicable multi-media recordings, the indication responsive to a query request; and
requesting initiation of transmission of at least one of the one or more potentially applicable multi-media recordings from a remote storage location to a storage location accessible to the computer system.
34. The method of claim 30, further comprising:
applying a data retention policy to the first multi-media recording, wherein the data retention policy indicates a data retention period for the first multi-media recording based on the first metadata information.
35. The method of claim 30, further comprising:
initiating transmission of the first multi-media recording to a second storage location different from the first storage location, the second storage location comprising a storage area configured for storing at least multi-media recordings.
36. The method of claim 35, wherein the first storage location is a local network accessible storage location and the second storage location is a remote storage location.
37. The method of claim 30, further comprising:
correlating the first metadata information with additional metadata information to produce correlated metadata information, the additional metadata information comprising information regarding attributes describing recording circumstances of one or more additional multi-media recordings; and
generating an index based on the correlated metadata information, the index containing information pertaining to the first and the one or more additional multi-media recordings.
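The method of claim 30, together with the query handling of claim 31, can be sketched minimally as follows. Everything here — the index structure, the "event_type" category key, and the example location string — is a hypothetical illustration under assumed names, not the claimed implementation:

```python
from typing import Dict, List

# Hypothetical index mapping an event-type category to the access
# locations of matching recordings. Illustrative sketch only.
index: Dict[str, List[str]] = {}

def ingest(metadata: dict) -> str:
    """Categorize a multi-media recording using only its metadata
    (the recording itself is never read), then add it to the index."""
    category = metadata.get("event_type", "uncategorized")
    index.setdefault(category, []).append(metadata["access_location"])
    return category

def query(event_type: str) -> List[str]:
    """Answer a query request with information identifying the storage
    locations of potentially applicable recordings (cf. claim 31)."""
    return index.get(event_type, [])

ingest({"event_type": "traffic_stop",
        "recording_time": "2015-04-14T18:03:00Z",
        "access_location": "nas://unit117/rec_0042.mp4"})
assert query("traffic_stop") == ["nas://unit117/rec_0042.mp4"]
```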
US14/686,192 2014-08-29 2015-04-14 Shared server methods and systems for information storage, access, and security Abandoned US20160062992A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/686,192 US20160062992A1 (en) 2014-08-29 2015-04-14 Shared server methods and systems for information storage, access, and security
PCT/US2015/047532 WO2016033523A1 (en) 2014-08-29 2015-08-28 Compact multi-function dvr with multiple integrated wireless data communication devices

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462044139P 2014-08-29 2014-08-29
US14/686,192 US20160062992A1 (en) 2014-08-29 2015-04-14 Shared server methods and systems for information storage, access, and security

Publications (1)

Publication Number Publication Date
US20160062992A1 true US20160062992A1 (en) 2016-03-03

Family

ID=54932528

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/588,139 Expired - Fee Related US9225527B1 (en) 2014-08-29 2014-12-31 Hidden plug-in storage drive for data integrity
US14/686,192 Abandoned US20160062992A1 (en) 2014-08-29 2015-04-14 Shared server methods and systems for information storage, access, and security
US14/715,742 Abandoned US20160064036A1 (en) 2014-08-29 2015-05-19 Cloud information storage, access, and security

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/588,139 Expired - Fee Related US9225527B1 (en) 2014-08-29 2014-12-31 Hidden plug-in storage drive for data integrity

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/715,742 Abandoned US20160064036A1 (en) 2014-08-29 2015-05-19 Cloud information storage, access, and security

Country Status (1)

Country Link
US (3) US9225527B1 (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160351030A1 (en) * 2015-06-01 2016-12-01 Securonet Virtual safety network
US20170048556A1 (en) * 2014-03-07 2017-02-16 Dean Drako Content-driven surveillance image storage optimization apparatus and method of operation
CN107241583A (en) * 2017-07-28 2017-10-10 中国电信股份有限公司广东号百信息服务分公司 A kind of vehicle-mounted cloud resource intelligent dispatching system and method
EP3229174A1 (en) * 2016-04-06 2017-10-11 L-1 Identity Solutions AG Method for video investigation
US20180091781A1 (en) * 2009-11-09 2018-03-29 Verint Americas Inc. Method and apparatus to transmit video data
US20180131898A1 (en) * 2016-08-24 2018-05-10 WHP Worflow Solutions, LLC Portable recording device multimedia classification system
US10152858B2 (en) 2016-05-09 2018-12-11 Coban Technologies, Inc. Systems, apparatuses and methods for triggering actions based on data capture and characterization
US10165171B2 (en) 2016-01-22 2018-12-25 Coban Technologies, Inc. Systems, apparatuses, and methods for controlling audiovisual apparatuses
US20190068605A1 (en) * 2017-08-30 2019-02-28 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. System and method for providing access to secured data via a push notification
US10248335B2 (en) * 2016-07-14 2019-04-02 International Business Machines Corporation Reducing a size of backup data in storage
US10339970B2 (en) * 2016-04-26 2019-07-02 Idis Co., Ltd. Video recording apparatus with pre-event circulation recording function
US10370102B2 (en) 2016-05-09 2019-08-06 Coban Technologies, Inc. Systems, apparatuses and methods for unmanned aerial vehicle
US20200036945A1 (en) * 2018-07-24 2020-01-30 Comcast Cable Communications, Llc Neighborhood Proximity Media Capture For Life Safety Alarm Events
US20200117755A1 (en) * 2018-10-12 2020-04-16 International Business Machines Corporation Intelligent video bridge for a closed circuit television system
US10789840B2 (en) 2016-05-09 2020-09-29 Coban Technologies, Inc. Systems, apparatuses and methods for detecting driving behavior and triggering actions based on detected driving behavior
US11042305B2 (en) 2017-11-06 2021-06-22 Toshiba Memory Corporation Memory system and method for controlling nonvolatile memory
US11116033B2 (en) * 2016-07-11 2021-09-07 Motorola Solutions, Inc. Method and apparatus for disassociating from a network
US11210330B2 (en) * 2016-07-13 2021-12-28 Hangzhou Hikvision Digital Technology Co., Ltd. Method and apparatus for storing, reading, and displaying plurality of multimedia files
US20210406556A1 (en) * 2017-01-26 2021-12-30 Matias Klein Total Property Intelligence System
US11343472B2 (en) 2020-03-17 2022-05-24 Axis Ab Associating captured media to a party
US11386141B1 (en) * 2016-01-25 2022-07-12 Kelline ASBJORNSEN Multimedia management system (MMS)
US11501391B2 (en) 2018-12-20 2022-11-15 Motorola Solutions, Inc. Method and operation of a portable device and a cloud server for preserving the chain of custody for digital evidence
US11785266B2 (en) 2022-01-07 2023-10-10 Getac Technology Corporation Incident category selection optimization

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6011833B1 (en) * 2015-09-14 2016-10-19 パナソニックIpマネジメント株式会社 Wearable camera system and person notification method
US10230792B2 (en) * 2015-10-23 2019-03-12 International Business Machines Corporation Synchronizing proprietary data in an external cloud with data in a private storage system
US10904474B2 (en) * 2016-02-05 2021-01-26 Digital Ally, Inc. Comprehensive video collection and storage
US10498726B2 (en) 2016-03-22 2019-12-03 International Business Machines Corporation Container independent secure file system for security application containers
WO2017212499A1 (en) * 2016-06-07 2017-12-14 Vijay Mann Systems and methods for storage space management and high availability of digital media on mobile devices
US10241869B2 (en) * 2017-03-08 2019-03-26 International Business Machines Corporation Managing a deletion of a volume referenced by a snapshot of a consistency group
CN107371047A (en) * 2017-07-31 2017-11-21 天脉聚源(北京)教育科技有限公司 The continuity check method and device of video fragment
CN108055344A (en) * 2017-12-26 2018-05-18 重庆天智慧启科技有限公司 A kind of patrol system and night watching method based on mobile terminal device
EP4220535B1 (en) * 2018-03-19 2025-07-02 Axis AB System and method for handling data captured by a body worn camera
US11423161B1 (en) * 2018-05-26 2022-08-23 Genetec Inc. System and media recording device with secured encryption
US11024137B2 (en) * 2018-08-08 2021-06-01 Digital Ally, Inc. Remote video triggering and tagging
US10999310B2 (en) 2018-11-27 2021-05-04 Seagate Technology Llc Endpoint security client embedded in storage drive firmware
US11392711B2 (en) 2019-03-21 2022-07-19 Microsoft Technology Licensing, Llc Authentication state-based permission model for a file storage system
US11954065B2 (en) 2022-03-18 2024-04-09 Motorola Solutions, Inc. Device and method for extending retention periods of records
US11950017B2 (en) 2022-05-17 2024-04-02 Digital Ally, Inc. Redundant mobile video recording
CN119718998A (en) * 2023-09-27 2025-03-28 长江存储科技有限责任公司 Electronic device and operation method thereof, memory system, and storage medium

Family Cites Families (223)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4344184A (en) 1980-07-31 1982-08-10 Cetec Corporation Wireless microphone
US4543665A (en) 1982-07-13 1985-09-24 Plantronics, Inc. Speakerphone with wireless infrared microphone
DE3302876A1 (en) 1983-01-28 1984-08-02 Robert Bosch Gmbh, 7000 Stuttgart DIPOLANTENNA FOR PORTABLE RADIO DEVICES
US4910795A (en) 1987-06-11 1990-03-20 Mccowen Clinton R Wireless hand held microphone
US5012335A (en) 1988-06-27 1991-04-30 Alija Cohodar Observation and recording system for a police vehicle
US5111289A (en) 1990-04-27 1992-05-05 Lucas Gary L Vehicular mounted surveillance and recording system
US5408330A (en) 1991-03-25 1995-04-18 Crimtec Corporation Video incident capture system
US5477397A (en) 1993-02-23 1995-12-19 Matsushita Electric Corporation Of America Digital high definition television receiver with features that facilitate trick-play modes on a digital VCR
US5983161A (en) 1993-08-11 1999-11-09 Lemelson; Jerome H. GPS vehicle collision avoidance warning and control system and method
US6122403A (en) 1995-07-27 2000-09-19 Digimarc Corporation Computer system linked by using information in data objects
US5841978A (en) 1993-11-18 1998-11-24 Digimarc Corporation Network linking method using steganographically embedded data objects
US5862260A (en) 1993-11-18 1999-01-19 Digimarc Corporation Methods for surveying dissemination of proprietary empirical data
US5613032A (en) 1994-09-02 1997-03-18 Bell Communications Research, Inc. System and method for recording, playing back and searching multimedia events wherein video, audio and text can be searched and retrieved
US6002326A (en) 1994-09-19 1999-12-14 Valerie Turner Automotive vehicle anti-theft and anti-vandalism and anti-carjacking system
WO1997029550A1 (en) 1996-02-07 1997-08-14 L.S. Research, Inc. Digital wireless speaker system
GB2298100A (en) 1995-02-07 1996-08-21 Peng Seng Toh High resolution video imaging system for simultaneous acquisition of two high aspect ratio object fields
US5724475A (en) 1995-05-18 1998-03-03 Kirsten; Jeff P. Compressed digital video reload and playback system
US6505160B1 (en) 1995-07-27 2003-01-07 Digimarc Corporation Connected audio and other media objects
JP3484834B2 (en) 1995-07-28 2004-01-06 ソニー株式会社 Data encoding / decoding method and apparatus
WO1997038526A1 (en) 1996-04-08 1997-10-16 Skaggs Telecommunications Service, Inc. Law enforcement video documentation system and methods
US5926218A (en) 1996-06-04 1999-07-20 Eastman Kodak Company Electronic camera with dual resolution sensors
US5815093A (en) 1996-07-26 1998-09-29 Lextron Systems, Inc. Computerized vehicle log
US6038257A (en) 1997-03-12 2000-03-14 Telefonaktiebolaget L M Ericsson Motion and still video picture transmission and display
US5970098A (en) 1997-05-02 1999-10-19 Globespan Technologies, Inc. Multilevel encoder
US6188939B1 (en) 1997-08-18 2001-02-13 The Texas A&M University System Advanced law enforcement and response technology
US6028528A (en) 1997-10-24 2000-02-22 Mobile-Vision, Inc. Apparatus and methods for managing transfers of video recording media used for surveillance from vehicles
US6175860B1 (en) 1997-11-26 2001-01-16 International Business Machines Corporation Method and apparatus for an automatic multi-rate wireless/wired computer network
US6163338A (en) 1997-12-11 2000-12-19 Johnson; Dan Apparatus and method for recapture of realtime events
AU4223399A (en) 1998-06-01 1999-12-20 Robert Jeff Scaman Secure, vehicle mounted, incident recording system
US6445408B1 (en) 1998-07-22 2002-09-03 D. Scott Watkins Headrest and seat video imaging apparatus
US7197228B1 (en) 1998-08-28 2007-03-27 Monroe David A Multifunction remote control system for audio and video recording, capture, transmission and playback of full motion and still images
US6181693B1 (en) 1998-10-08 2001-01-30 High Speed Video, L.L.C. High speed video transmission over telephone lines
US6141611A (en) 1998-12-01 2000-10-31 John J. Mackey Mobile vehicle accident data system
AU2857900A (en) 1999-01-29 2000-08-18 Sony Electronics Inc. Method and apparatus for associating environmental information with digital images
US6518881B2 (en) 1999-02-25 2003-02-11 David A. Monroe Digital communication system for law enforcement use
US6462778B1 (en) 1999-02-26 2002-10-08 Sony Corporation Methods and apparatus for associating descriptive data with digital image files
US6424820B1 (en) 1999-04-02 2002-07-23 Interval Research Corporation Inductively coupled wireless system and method
DE60001562D1 (en) 1999-05-25 2003-04-10 Swtv Production Services Inc METHOD AND DEVICE FOR CREATING DIGITAL ARCHIVES
US7631195B1 (en) * 2006-03-15 2009-12-08 Super Talent Electronics, Inc. System and method for providing security to a portable storage device
US6421080B1 (en) 1999-11-05 2002-07-16 Image Vault Llc Digital surveillance system with pre-event recording
US6675027B1 (en) 1999-11-22 2004-01-06 Microsoft Corp Personal mobile computing device having antenna microphone for improved speech recognition
US7934251B2 (en) 1999-12-02 2011-04-26 Western Digital Technologies, Inc. Managed peer-to-peer applications, systems and methods for distributed data access and storage
JP2001189668A (en) 1999-12-28 2001-07-10 Circuit Design:Kk Wireless microphone device and transmitter device for wireless microphone
US6298290B1 (en) 1999-12-30 2001-10-02 Niles Parts Co., Ltd. Memory apparatus for vehicle information data
JP5148797B2 (en) 2000-02-08 2013-02-20 ソニー株式会社 Video data recording apparatus and video data recording method
US6510177B1 (en) 2000-03-24 2003-01-21 Microsoft Corporation System and method for layered video coding enhancement
DE10018157A1 (en) 2000-04-12 2001-10-18 Bosch Gmbh Robert Monitoring device
JP3938460B2 (en) 2000-05-24 2007-06-27 株式会社リコー Information recording apparatus, information recording method, recording medium on which information recording processing program is recorded, optical disc recording apparatus, optical disc recording method, information recording system, and optical disc recording system
JP3563326B2 (en) 2000-06-20 2004-09-08 松下電器産業株式会社 Wireless microphone communication system
US6789030B1 (en) 2000-06-23 2004-09-07 Bently Nevada, Llc Portable data collector and analyzer: apparatus and method
US7155615B1 (en) * 2000-06-30 2006-12-26 Intel Corporation Method and apparatus for providing a secure-private partition on a hard disk drive of a computer system via IDE controller
WO2002061955A2 (en) 2000-10-17 2002-08-08 Synapse, Inc. System and method for wireless data exchange between an appliance and a handheld device
US7868912B2 (en) 2000-10-24 2011-01-11 Objectvideo, Inc. Video surveillance system employing video primitives
DE10053683A1 (en) 2000-10-28 2002-05-08 Alcatel Sa Image monitoring
US6788338B1 (en) 2000-11-20 2004-09-07 Petko Dimitrov Dinev High resolution video camera apparatus having two image sensors and signal processing
US8126968B2 (en) 2000-12-26 2012-02-28 Polycom, Inc. System and method for coordinating a conference using a dedicated server
US6788983B2 (en) 2001-02-07 2004-09-07 The Hong Kong Polytechnic University Audio trigger devices
US8126276B2 (en) 2001-02-21 2012-02-28 International Business Machines Corporation Business method for selectable semantic codec pairs for very low data-rate video transmission
US7190882B2 (en) 2001-03-19 2007-03-13 Applied Concepts, Inc. In-car digital video recording with MPEG-4 compression for police cruisers and other vehicles
US6831556B1 (en) 2001-05-16 2004-12-14 Digital Safety Technologies, Inc. Composite mobile digital information system
US7119832B2 (en) 2001-07-23 2006-10-10 L-3 Communications Mobile-Vision, Inc. Wireless microphone for use with an in-car video system
US20030052970A1 (en) 2001-09-19 2003-03-20 Dodds G. Alfred Automatically activated wireless microphone for in-car video system
US20030081127A1 (en) 2001-10-30 2003-05-01 Kirmuss Charles Bruno Mobile digital video recording with pre-event recording
US20030095688A1 (en) 2001-10-30 2003-05-22 Kirmuss Charles Bruno Mobile motor vehicle identification
US20030080878A1 (en) 2001-10-30 2003-05-01 Kirmuss Charles Bruno Event-based vehicle image capture
US20030081935A1 (en) 2001-10-30 2003-05-01 Kirmuss Charles Bruno Storage of mobile video recorder content
US6624611B2 (en) 2001-10-30 2003-09-23 Taw Security Concepts, Inc. Sensing vehicle battery charging and/or engine block heating to trigger pre-heating of a mobile electronic device
US20070124292A1 (en) 2001-10-30 2007-05-31 Evan Kirshenbaum Autobiographical and other data collection system
WO2003039144A2 (en) 2001-11-01 2003-05-08 A4S Technologies, Inc. Remote surveillance system
JP3979826B2 (en) 2001-11-14 2007-09-19 Tdk株式会社 Memory controller, memory system, and memory control method
US7835530B2 (en) 2001-11-26 2010-11-16 Cristiano Avigni Systems and methods for determining sound of a moving object
US7167519B2 (en) 2001-12-20 2007-01-23 Siemens Corporate Research, Inc. Real-time video object generation for smart cameras
US8068023B2 (en) 2001-12-28 2011-11-29 Dulin Jacques M System for maintaining security of evidence throughout chain of custody
US20060165386A1 (en) 2002-01-08 2006-07-27 Cernium, Inc. Object selective video recording
US20030151663A1 (en) 2002-01-23 2003-08-14 Mobile-Vision, Inc. Video storage and delay device for use with an in-car video system
KR100519260B1 (en) 2002-02-21 2005-10-06 주식회사 애드일렉코 Rapidly optimized wireless mike and method thereof
US6825780B2 (en) 2002-04-19 2004-11-30 Droplet Technology, Inc. Multiple codec-imager system and method
US7091851B2 (en) 2002-07-02 2006-08-15 Tri-Sentinel, Inc. Geolocation system-enabled speaker-microphone accessory for radio communication devices
US20040008255A1 (en) 2002-07-11 2004-01-15 Lewellen Mark A. Vehicle video system and method
GB2391687A (en) 2002-08-09 2004-02-11 Tecton Ltd Digital video recorder with simultaneous record and playback capability
US20040051793A1 (en) 2002-09-18 2004-03-18 Tecu Kirk S. Imaging device
US7693289B2 (en) 2002-10-03 2010-04-06 Audio-Technica U.S., Inc. Method and apparatus for remote control of an audio source such as a wireless microphone system
US20050185936A9 (en) 2002-11-08 2005-08-25 Ich-Kien Lao Mobile and vehicle-based digital video system
US20040177253A1 (en) 2002-11-19 2004-09-09 My Ez Communications, Llc. Automated and secure digital mobile video monitoring and recording
FR2849738B1 (en) 2003-01-08 2005-03-25 Holding Bev Sa PORTABLE TELEPHONE VIDEO SURVEILLANCE DEVICE, OPERATING METHOD, APPLICABLE, AND TAMPERING NETWORK
US20040146272A1 (en) 2003-01-09 2004-07-29 Kessel Kurt A. System and method for managing video evidence
US7554587B2 (en) 2003-03-11 2009-06-30 Fujifilm Corporation Color solid-state image pickup device
EP1627524A4 (en) 2003-03-20 2009-05-27 Ge Security Inc Systems and methods for multi-resolution image processing
ATE369694T1 (en) 2003-04-29 2007-08-15 Sony Ericsson Mobile Comm Ab USER INTERFACE UNIT FOR A TELEPHONE
US8270647B2 (en) 2003-05-08 2012-09-18 Advanced Bionics, Llc Modular speech processor headpiece
US20050007458A1 (en) 2003-05-14 2005-01-13 Frederic Benattou Standalone remote video surveillance device
JP4806515B2 (en) 2003-05-19 2011-11-02 株式会社日立製作所 Encoding apparatus, video camera apparatus using the same, and encoding method
CN1302382C (en) 2003-06-13 2007-02-28 联想(北京)有限公司 Verification method based on storage medium private space of USB flash memory disc
JP2005511607A (en) 2003-07-10 2005-04-28 アラーガン、インコーポレイテッド Wireless microphone communication system
AU2003275331A1 (en) 2003-08-26 2005-04-14 Icop Digital Data acquisition and display system and method of operating the same
US7551894B2 (en) 2003-10-07 2009-06-23 Phonak Communications Ag Wireless microphone
US20050088521A1 (en) 2003-10-22 2005-04-28 Mobile-Vision Inc. In-car video system using flash memory as a recording medium
US20050243171A1 (en) 2003-10-22 2005-11-03 Ross Charles A Sr Data acquisition and display system and method of establishing chain of custody
US20060077256A1 (en) 2003-11-07 2006-04-13 Silvemail William B High resolution pre-event record
US7231233B2 (en) 2003-11-25 2007-06-12 G Squared, Llc Combined multi-media and in ear monitoring system and method of remote monitoring and control thereof
US20050113021A1 (en) 2003-11-25 2005-05-26 G Squared, Llc Wireless communication system for media transmission, production, recording, reinforcement and monitoring in real-time
US7428314B2 (en) 2003-12-03 2008-09-23 Safehouse International Inc. Monitoring an environment
AU2004233453B2 (en) 2003-12-03 2011-02-17 Envysion, Inc. Recording a sequence of images
KR100994772B1 (en) 2004-01-10 2010-11-16 삼성전자주식회사 How to copy and play data on a storage medium
GB2410098B (en) * 2004-01-16 2006-10-11 Sony Uk Ltd Security system
JP2007528645A (en) 2004-02-17 2007-10-11 タレス アビオニクス インコーポレイテッド Multiple camera surveillance system and its use
EP1721237B1 (en) 2004-02-27 2012-08-29 Simon Richard Daniel Wearable modular interface strap
JP2005266934A (en) 2004-03-16 2005-09-29 Hagiwara Sys-Com:Kk Usb storage device and controller therefor
US20060055521A1 (en) 2004-09-15 2006-03-16 Mobile-Vision Inc. Automatic activation of an in-car video recorder using a GPS speed signal
US8253796B2 (en) 2004-09-30 2012-08-28 Smartvue Corp. Wireless video surveillance system and method with rapid installation
US9071847B2 (en) 2004-10-06 2015-06-30 Microsoft Technology Licensing, Llc Variable coding resolution in video codec
JP4513487B2 (en) 2004-10-06 2010-07-28 株式会社日立製作所 Video data compression device
US8081214B2 (en) 2004-10-12 2011-12-20 Enforcement Video, Llc Method of and system for mobile surveillance and event recording
US20060078046A1 (en) 2004-10-13 2006-04-13 Aegison Corp. Method and system for multi-path video delivery and distribution
EP1655855A1 (en) 2004-11-03 2006-05-10 Topspeed Technology Corp. Method of Frequency Hopping for avoiding interference
US20060133476A1 (en) 2004-11-12 2006-06-22 Page Warren S Digital in-car video surveillance system
US7356473B2 (en) 2005-01-21 2008-04-08 Lawrence Kates Management and assistance system for the deaf
US7778601B2 (en) 2005-01-24 2010-08-17 Broadcom Corporation Pairing modular wireless earpiece/microphone (HEADSET) to a serviced base portion and subsequent access thereto
US8489151B2 (en) 2005-01-24 2013-07-16 Broadcom Corporation Integrated and detachable wireless headset element for cellular/mobile/portable phones and audio playback devices
US7877115B2 (en) 2005-01-24 2011-01-25 Broadcom Corporation Battery management in a modular earpiece microphone combination
US20060270465A1 (en) 2005-05-31 2006-11-30 Matthew Lee Wireless microphone for public safety use
US20060274116A1 (en) 2005-06-01 2006-12-07 Wu Carl L Ink-jet assembly coatings and related methods
US7818078B2 (en) 2005-06-06 2010-10-19 Gonzalo Fuentes Iriarte Interface device for wireless audio applications
US8289370B2 (en) 2005-07-20 2012-10-16 Vidyo, Inc. System and method for scalable and low-delay videoconferencing using scalable video coding
US7768548B2 (en) 2005-08-12 2010-08-03 William Bradford Silvernail Mobile digital video recording system
US20070064108A1 (en) 2005-09-16 2007-03-22 Haler Robert D Rear view mirror with integrated video system
US11428937B2 (en) 2005-10-07 2022-08-30 Percept Technologies Enhanced optical and perceptual digital eyewear
US8982944B2 (en) 2005-10-12 2015-03-17 Enforcement Video, Llc Method and system for categorized event recording of images in multiple resolution levels
US20070086601A1 (en) 2005-10-17 2007-04-19 Mitchler Dennis W Flexible wireless air interface system
DE102005054258B4 (en) 2005-11-11 2015-10-22 Sennheiser Electronic Gmbh & Co. Kg A method of assigning a frequency for wireless audio communication
US20070111754A1 (en) 2005-11-14 2007-05-17 Marshall Bill C User-wearable data acquisition system including a speaker microphone that is coupled to a two-way radio
JP2007181083A (en) 2005-12-28 2007-07-12 Sony Corp Wireless microphone and transmitter mounting arrangement of wireless microphone
GB2433845A (en) 2005-12-29 2007-07-04 Motorola Inc Computer with retractable socket
US7688203B2 (en) 2006-01-12 2010-03-30 Alfred Gerhold Rockefeller Surveillance device by use of digital cameras linked to a cellular or wireless telephone
US20070217761A1 (en) 2006-03-07 2007-09-20 Coban Research And Technologies, Inc. Method for video/audio recording using unrestricted pre-event/post-event buffering with multiple bit and frame rates buffer files
TWM297121U (en) 2006-03-13 2006-09-01 Mipro Electronics Co Ltd Battery device of wireless microphone
CN2907145Y (en) 2006-04-04 2007-05-30 中国科学院声学研究所 A multi-channel wireless microphone system
US20080005472A1 (en) 2006-06-30 2008-01-03 Microsoft Corporation Running applications from removable media
JP2008030287A (en) 2006-07-28 2008-02-14 Fuji Xerox Co Ltd Printing apparatus, printing system and program
JP4381402B2 (en) 2006-09-13 2009-12-09 ティーオーエー株式会社 Wireless microphone device
US7414587B2 (en) 2006-09-25 2008-08-19 Shure Acquisition Holdings, Inc. Antenna in a wireless system
CA2706695C (en) 2006-12-04 2019-04-30 Lynx System Developers, Inc. Autonomous systems and methods for still and moving picture production
US8144892B2 (en) 2006-12-18 2012-03-27 The Sapling Company, Inc. Of Huntingdon Valley, Pa. Audio amplification system
US20080165250A1 (en) 2007-01-08 2008-07-10 Jeff Kirk Ekdahl Vehicle security surveillance system
US9143009B2 (en) 2007-02-01 2015-09-22 The Chamberlain Group, Inc. Method and apparatus to facilitate providing power to remote peripheral devices for use with a movable barrier operator system
US7870076B2 (en) 2007-02-27 2011-01-11 Red Hat, Inc. Method and an apparatus to provide interoperability between different protection schemes
JP2008258970A (en) 2007-04-05 2008-10-23 Sony Corp Wireless audio transmission system, wireless microphone, audio transmitter, audio receiver, imaging device, recorder, and audio mixer
US20090017881A1 (en) 2007-07-10 2009-01-15 David Madrigal Storage and activation of mobile phone components
WO2009009777A1 (en) 2007-07-12 2009-01-15 Bae Systems Information And Electronic Systems Integration Inc. Spectrum sensing function for cognitive radio applications
US8121306B2 (en) 2007-08-17 2012-02-21 Enforcement Video, Llc Range-sensitive wireless microphone with out-of-range recording feature
US20090076636A1 (en) 2007-09-13 2009-03-19 Bionica Corporation Method of enhancing sound for hearing impaired individuals
US20090074216A1 (en) 2007-09-13 2009-03-19 Bionica Corporation Assistive listening system with programmable hearing aid and wireless handheld programmable digital signal processing device
US8660055B2 (en) 2007-10-31 2014-02-25 Bose Corporation Pseudo hub-and-spoke wireless audio network
US8208024B2 (en) 2007-11-30 2012-06-26 Target Brands, Inc. Communication and surveillance system
ES2443918T5 (en) 2007-12-27 2017-06-06 Oticon A/S Hearing device and method for receiving and/or sending wireless data
FR2926375B1 (en) 2008-01-11 2010-02-12 Airbus France METHOD FOR PERFORMING COMPUTER APPLICATION, KIT AND AIRCRAFT
JP2009169922A (en) 2008-01-17 2009-07-30 Watanabe Hideki USB self-environment starting memory (USB-SES memory)
US8228364B2 (en) 2008-01-29 2012-07-24 Enforcement Video, Llc Omnidirectional camera for use in police car event recording
US20090195651A1 (en) 2008-01-31 2009-08-06 Leonard Robert C Method of providing safety enforcement for school buses
US20090213902A1 (en) 2008-02-27 2009-08-27 Ming-Fure Jeng Automatic frequency hopping and channel tracking system for auto-frequency-hopping digital wireless microphone
FI20085280A0 (en) 2008-04-03 2008-04-03 Polar Electro Oy Communication between handheld and matching device
SE534099C2 (en) 2008-06-02 2011-04-26 Klaus Drosch Device for data protection
WO2010002921A1 (en) 2008-07-01 2010-01-07 Yoostar Entertainment Group, Inc. Interactive systems and methods for video compositing
CN101309088B (en) 2008-07-04 2011-11-30 华中科技大学 Random addressing self-adapting frequency-hopping wireless microphone and receiving machine thereof
US8145134B2 (en) 2008-07-22 2012-03-27 At&T Intellectual Property I, L.P. Wireless microphone beacon
US8712362B2 (en) 2008-07-26 2014-04-29 Enforcement Video, Llc Method and system of extending battery life of a wireless microphone unit
EP2150057A3 (en) 2008-07-29 2013-12-11 Gerald Curry Camera-based tracking and position determination for sporting events
US8166220B2 (en) * 2008-08-04 2012-04-24 Sandisk Il Ltd. Device for connection with a storage device and a host
US8422944B2 (en) 2008-08-12 2013-04-16 Sony Corporation Personal function pad
GB2463277B (en) 2008-09-05 2010-09-08 Sony Comp Entertainment Europe Wireless communication system
US8174577B2 (en) 2008-09-11 2012-05-08 Tech-Cast Mfg. Corp. Automatic in-car video recording apparatus for recording driving conditions inside and outside a car
US8260217B2 (en) 2008-10-30 2012-09-04 Taiwan Gomet Technology Co., Ltd. Bidirectional wireless microphone system with automatic login function
TWI383319B (en) 2008-11-26 2013-01-21 Via Tech Inc Computer system and booting method of same
US8713209B2 (en) 2009-01-13 2014-04-29 Qualcomm Incorporated System, apparatus, and method for fast startup of USB devices
US8744087B2 (en) 2009-02-09 2014-06-03 Revo Labs, Inc. Wireless multi-user audio system
US8311983B2 (en) 2009-04-28 2012-11-13 Whp Workflow Solutions, Llc Correlated media for distributed sources
US20100289648A1 (en) 2009-05-13 2010-11-18 Bradley Richard Ree Wireless Microphone with External Enable
US20100302979A1 (en) 2009-05-28 2010-12-02 Nokia Corporation Power saving in wireless communication
US8254844B2 (en) 2009-05-29 2012-08-28 Motorola Solutions, Inc. Method and apparatus for utilizing a transmission polarization to reduce interference with a primary incumbent signal
KR100926165B1 (en) 2009-08-18 2009-11-10 (주)애니쿼터스 Automatic transmission device and method for a mobile terminal providing one-shot call, one-shot SMS, and one-shot Internet connection functions through an NFC controller
US8780199B2 (en) 2009-09-20 2014-07-15 Tibet MIMAR Networked security camera with local storage and continuous recording loop
US20110142156A1 (en) 2009-12-15 2011-06-16 Sony Ericsson Mobile Communications Ab Multi-channel signaling
DK2534853T3 (en) 2010-02-12 2017-02-13 Sonova Ag Wireless sound transmission system and method
US20120310395A1 (en) 2010-02-12 2012-12-06 Phonak Ag Wireless sound transmission system and method using improved frequency hopping and power saving mode
US8485404B2 (en) 2010-03-29 2013-07-16 My Innoventure, LLC Cases and covers for handheld electronic devices
US9112989B2 (en) 2010-04-08 2015-08-18 Qualcomm Incorporated System and method of smart audio logging for mobile devices
US9386116B2 (en) 2010-05-13 2016-07-05 Futurewei Technologies, Inc. System, apparatus for content delivery for internet traffic and methods thereof
US8792589B2 (en) 2010-05-13 2014-07-29 Wi-Lan Inc. System and method for protecting transmissions of wireless microphones operating in television band white space
US8670380B2 (en) 2010-06-08 2014-03-11 Audio Technica, U.S., Inc Distributed reception wireless microphone system
FR2962231B1 (en) 2010-07-02 2014-10-31 Alcatel Lucent METHOD FOR ALIGNING AND FIXING AN OPTICAL FIBER COUPLED WITH AN OPTOELECTRONIC COMPONENT
US8380131B2 (en) 2010-07-20 2013-02-19 Albert Chiang Frequency selection system including a wireless microphone and a receiver
JP5608484B2 (en) 2010-09-06 2014-10-15 株式会社リョーイン Storage device and network connection setting method
KR20130101540A (en) 2010-10-11 2013-09-13 인터디지탈 패튼 홀딩스, 인크 Method and apparatus for bandwidth allocation for cognitive radio networks
US8707392B2 (en) 2010-10-15 2014-04-22 Roche Diagnostics Operations, Inc. Systems and methods for disease management
US8497940B2 (en) 2010-11-16 2013-07-30 Audio-Technica U.S., Inc. High density wireless system
US8831677B2 (en) 2010-11-17 2014-09-09 Antony-Euclid C. Villa-Real Customer-controlled instant-response anti-fraud/anti-identity theft devices (with true-personal identity verification), method and systems for secured global applications in personal/business e-banking, e-commerce, e-medical/health insurance checker, e-education/research/invention, e-disaster advisor, e-immigration, e-airport/aircraft security, e-military/e-law enforcement, with or without NFC component and system, with cellular/satellite phone/internet/multi-media functions
JP5498366B2 (en) 2010-12-10 2014-05-21 Toa株式会社 Wireless microphone system
WO2012100114A2 (en) 2011-01-20 2012-07-26 Kogeto Inc. Multiple viewpoint electronic media system
US9276667B2 (en) 2011-02-22 2016-03-01 Revolabs, Inc. Systems and methods for wireless audio conferencing
US8489065B2 (en) 2011-05-03 2013-07-16 Robert M Green Mobile device controller application for any security system
KR101703931B1 (en) * 2011-05-24 2017-02-07 한화테크윈 주식회사 Surveillance system
US20120307070A1 (en) 2011-06-02 2012-12-06 James Pierce Surveillance method utilizing video compression for wireless transmission
CN102355618A (en) 2011-07-28 2012-02-15 刘喆 Modulated wave wireless microphone
CN103718570B (en) 2011-08-09 2016-09-07 索诺瓦公司 Wireless voice transmission system and method
US10129211B2 (en) 2011-09-15 2018-11-13 Stephan HEATH Methods and/or systems for an online and/or mobile privacy and/or security encryption technologies used in cloud computing with the combination of data mining and/or encryption of user's personal data and/or location data for marketing of internet posted promotions, social messaging or offers using multiple devices, browsers, operating systems, networks, fiber optic communications, multichannel platforms
US8630908B2 (en) 2011-11-02 2014-01-14 Avery Dennison Corporation Distributed point of sale, electronic article surveillance, and product information system, apparatus and method
WO2013074947A2 (en) 2011-11-18 2013-05-23 Rubriq Corporation Method and apparatus for enabling recipient interaction with a content stream
EP2813121A2 (en) 2012-01-11 2014-12-17 Interdigital Patent Holdings, Inc. Adaptive control channel
DK2805464T3 (en) 2012-01-20 2016-07-04 Sonova Ag Wireless audio transmission and method
EP2823683A1 (en) 2012-02-03 2015-01-14 Interdigital Patent Holdings, Inc. Method and apparatus for coexistence among wireless transmit/receive units (wtrus) operating in the same spectrum
TW201336324A (en) 2012-02-23 2013-09-01 Taiwan Carol Electronics Co Ltd Wireless microphone frequency matching system and method
WO2013150326A1 (en) 2012-04-03 2013-10-10 Gvr Trade Sa Passive wireless microphone
WO2013192312A2 (en) 2012-06-19 2013-12-27 IPA (Cayman) Limited Secure digital remediation systems and methods for managing an online reputation
US9729590B2 (en) 2012-06-19 2017-08-08 Bridg-It Llc Digital communication and monitoring system and method designed for school communities
US20140078304A1 (en) 2012-09-20 2014-03-20 Cloudcar, Inc. Collection and use of captured vehicle data
US10272848B2 (en) 2012-09-28 2019-04-30 Digital Ally, Inc. Mobile video and imaging system
CN102932703A (en) 2012-10-17 2013-02-13 杨志豪 Wireless microphone
WO2014081711A1 (en) * 2012-11-20 2014-05-30 Utility Associates, Inc System and method for securely distributing legal evidence
CN202957973U (en) 2012-12-25 2013-05-29 周芸 High-performance amplitude-modulation wireless microphone
US9787669B2 (en) 2013-03-14 2017-10-10 Comcast Cable Communications, Llc Identity authentication using credentials
US20150032535A1 (en) 2013-07-25 2015-01-29 Yahoo! Inc. System and method for content based social recommendations and monetization thereof
US9098956B2 (en) * 2013-09-26 2015-08-04 Lytx, Inc. Dynamic uploading protocol
CN103617005A (en) 2013-11-28 2014-03-05 中国联合网络通信集团有限公司 Access method, device and system for intelligent card

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180091781A1 (en) * 2009-11-09 2018-03-29 Verint Americas Inc. Method and apparatus to transmit video data
US20170048556A1 (en) * 2014-03-07 2017-02-16 Dean Drako Content-driven surveillance image storage optimization apparatus and method of operation
US10412420B2 (en) * 2014-03-07 2019-09-10 Eagle Eye Networks, Inc. Content-driven surveillance image storage optimization apparatus and method of operation
US20160351030A1 (en) * 2015-06-01 2016-12-01 Securonet Virtual safety network
US10165171B2 (en) 2016-01-22 2018-12-25 Coban Technologies, Inc. Systems, apparatuses, and methods for controlling audiovisual apparatuses
US11386141B1 (en) * 2016-01-25 2022-07-12 Kelline ASBJORNSEN Multimedia management system (MMS)
US11037604B2 (en) 2016-04-06 2021-06-15 Idemia Identity & Security Germany Ag Method for video investigation
EP3229174A1 (en) * 2016-04-06 2017-10-11 L-1 Identity Solutions AG Method for video investigation
US10339970B2 (en) * 2016-04-26 2019-07-02 Idis Co., Ltd. Video recording apparatus with pre-event circulation recording function
US10152858B2 (en) 2016-05-09 2018-12-11 Coban Technologies, Inc. Systems, apparatuses and methods for triggering actions based on data capture and characterization
US10152859B2 (en) 2016-05-09 2018-12-11 Coban Technologies, Inc. Systems, apparatuses and methods for multiplexing and synchronizing audio recordings
US10370102B2 (en) 2016-05-09 2019-08-06 Coban Technologies, Inc. Systems, apparatuses and methods for unmanned aerial vehicle
US10789840B2 (en) 2016-05-09 2020-09-29 Coban Technologies, Inc. Systems, apparatuses and methods for detecting driving behavior and triggering actions based on detected driving behavior
US11116033B2 (en) * 2016-07-11 2021-09-07 Motorola Solutions, Inc. Method and apparatus for disassociating from a network
US11210330B2 (en) * 2016-07-13 2021-12-28 Hangzhou Hikvision Digital Technology Co., Ltd. Method and apparatus for storing, reading, and displaying plurality of multimedia files
US10248335B2 (en) * 2016-07-14 2019-04-02 International Business Machines Corporation Reducing a size of backup data in storage
US20180131898A1 (en) * 2016-08-24 2018-05-10 WHP Workflow Solutions, LLC Portable recording device multimedia classification system
US11516427B2 (en) * 2016-08-24 2022-11-29 Getac Technology Corporation Portable recording device for real-time multimedia streams
US10511801B2 (en) * 2016-08-24 2019-12-17 Whp Workflow Solutions, Inc. Portable recording device multimedia classification system
US10958868B2 (en) 2016-08-24 2021-03-23 Getac Technology Corporation Portable recording device multimedia classification system
US20230083330A1 (en) * 2016-08-24 2023-03-16 Getac Technology Corporation Incident report generation from multimedia data capture
US12192672B2 (en) * 2016-08-24 2025-01-07 Getac Technology Corporation Incident report generation from multimedia data capture
US20210406556A1 (en) * 2017-01-26 2021-12-30 Matias Klein Total Property Intelligence System
US12118874B2 (en) * 2017-01-26 2024-10-15 Matias Klein Total property intelligence system
CN107241583 (en) * 2017-07-28 2017-10-10 中国电信股份有限公司广东号百信息服务分公司 Vehicle-mounted cloud resource intelligent scheduling system and method
US10791120B2 (en) * 2017-08-30 2020-09-29 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. System and method for providing access to secured data via a push notification
US20190068605A1 (en) * 2017-08-30 2019-02-28 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. System and method for providing access to secured data via a push notification
US11042305B2 (en) 2017-11-06 2021-06-22 Toshiba Memory Corporation Memory system and method for controlling nonvolatile memory
US11747989B2 (en) 2017-11-06 2023-09-05 Kioxia Corporation Memory system and method for controlling nonvolatile memory
US20200036945A1 (en) * 2018-07-24 2020-01-30 Comcast Cable Communications, Llc Neighborhood Proximity Media Capture For Life Safety Alarm Events
US20200117755A1 (en) * 2018-10-12 2020-04-16 International Business Machines Corporation Intelligent video bridge for a closed circuit television system
US11501391B2 (en) 2018-12-20 2022-11-15 Motorola Solutions, Inc. Method and operation of a portable device and a cloud server for preserving the chain of custody for digital evidence
US11343472B2 (en) 2020-03-17 2022-05-24 Axis Ab Associating captured media to a party
US11785266B2 (en) 2022-01-07 2023-10-10 Getac Technology Corporation Incident category selection optimization
US12238347B2 (en) 2022-01-07 2025-02-25 Getac Technology Corporation Incident category selection optimization

Also Published As

Publication number Publication date
US20160064036A1 (en) 2016-03-03
US9225527B1 (en) 2015-12-29

Similar Documents

Publication Publication Date Title
US20160062992A1 (en) Shared server methods and systems for information storage, access, and security
US20170200476A1 (en) Systems, apparatuses and methods for facilitating access to and distribution of audiovisual files by use of modified audiovisual files
US20220004274A1 (en) Systems and Methods for Bulk Redaction of Recorded Data
US9307317B2 (en) Wireless programmable microphone apparatus and system for integrated surveillance system devices
US20160065908A1 (en) Portable camera apparatus and system for integrated surveillance system devices
US12056783B2 (en) Systems and methods for processing recorded data for storage using computer-aided dispatch information
WO2016033523A1 (en) Compact multi-function dvr with multiple integrated wireless data communication devices
US11995734B2 (en) Auditing recorded data from a recording device
US10848717B2 (en) Systems and methods for generating an audit trail for auditable devices
KR20150080058A (en) Video sharing system and method of black box for vehicle
US20080320043A1 (en) OfficerAssist
CN220874562U (en) Big-data device for police monitoring and analysis of key personnel
TWI536328B (en) Traffic image division of labor identification system
EP3323242A1 (en) Systems and methods for generating an audit trail for auditable devices
CA3062846A1 (en) Mobile digital video and data recording system

Legal Events

Date Code Title Description
AS Assignment

Owner name: COBAN TECHNOLOGIES, INC, TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHEN, ALLAN;TAN, YUN LONG;REEL/FRAME:035406/0687

Effective date: 20141112

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION