EP3207501A1 - Forensic video recording with presence detection

Info
- Publication number
- EP3207501A1 (Application No. EP15850436.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- law enforcement
- image data
- event
- proximity tag
- recording device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/74—Systems using reradiation of radio waves, e.g. secondary radar systems; Analogous systems
Definitions
- Embodiments and/or features of the invention described in the present document may be used with the subject matter disclosed in commonly assigned U.S. Patent No. 8,781,292, filed September 27, 2013, issued July 15, 2014, and entitled "COMPUTER PROGRAM, METHOD, AND SYSTEM FOR MANAGING MULTIPLE DATA RECORDING DEVICES" ("the '292 Patent"), which is a continuation application of the '151 Application.
- the '292 Patent is hereby incorporated by reference in its entirety into the present application.
- Embodiments and/or features of the invention described in the present document may be used with the subject matter disclosed in commonly assigned U.S. Patent Application No. 14/040,329, filed September 27, 2013, and entitled "PORTABLE VIDEO AND IMAGING SYSTEM" ("the '329 Application"); and commonly assigned U.S. Patent Application No. 14/040,006, filed September 27, 2013, and entitled "MOBILE VIDEO AND IMAGING SYSTEM" ("the '006 Application").
- the '329 Application and the '006 Application are hereby incorporated by reference in their entirety into the present application.
- Embodiments of the invention generally relate to augmenting video data with presence data derived from one or more proximity tags. More specifically, embodiments of the invention generate forensically authenticated recordings linking video imagery to the presence of specific objects in or near the recording.

2. RELATED ART
- Video recordings of law enforcement activities are becoming more common and more frequently used in legal proceedings.
- It is presently a weakness of such systems that the recordings they produce cannot be verifiably traced back to a specific recording device or authenticated as unaltered.
- Furthermore, specific objects appearing in the scene cannot be identified. This can be particularly significant when, for example, calibration or identification records for a device need to be produced.
- a first embodiment of the invention includes a video recording system comprising a camera, a wireless proximity tag reader, a storage memory, and control circuitry operable to receive image data from the camera, receive a proximity tag identifier identifying a proximity tag from the proximity tag reader, and store an encoded frame containing the image data and the proximity tag identity in the storage memory.
- a method of recording authenticated video with presence data comprises the steps of creating an augmented encoded frame by encoding video data into an encoded frame, receiving one or more proximity tag identifiers from a proximity tag reader, including the received proximity tag identifiers as metadata for the encoded frame to produce the augmented encoded frame, generating a digital signature for the augmented encoded frame, and storing the augmented encoded frame and digital signature.
- a computer-readable medium storing an augmented video file is disclosed, the video file comprising a plurality of augmented frames, each augmented frame of the plurality of augmented frames including video data and one or more identifiers each associated with a proximity tag.
- FIG. 1 depicts an exemplary hardware platform of certain embodiments of the invention
- FIG. 2 depicts a system diagram illustrating the components of one embodiment of the invention
- FIG. 3 depicts a flowchart illustrating the operation of one embodiment of the invention.
- FIGs. 4A and 4B depict illustrative video files produced in accordance with one embodiment of the present invention.
- the drawing figures do not limit the invention to the specific embodiments disclosed and described herein. The drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the invention.
- Embodiments of the invention may be embodied as, among other things, a method, system, or set of instructions embodied on one or more computer-readable media.
- Computer-readable media include both volatile and nonvolatile media, removable and nonremovable media, and contemplate media readable by a database.
- Computer-readable media include (but are not limited to) RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD), holographic media or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage, and other magnetic storage devices. These technologies can store data temporarily or permanently.
- computer-readable media should not be construed to include physical, but transitory, forms of signal transmission such as radio broadcasts, electrical signals through a wire, or light pulses through a fiber-optic cable. Examples of stored information include computer-useable instructions, data structures, program modules, and other data representations.
- volatile storage media such as RAM may retain data only as long as it is powered, while non-volatile media such as flash memory retain data even when powered off.
- some forms of computer storage media are write-once, read many (WORM), such that data can be stored to them but not erased or overwritten.
- On some WORM media, data can be recorded in multiple sessions, where the data from one session is appended to the data from the previous session.
- Other forms of media may be indefinitely rewriteable.
- Some forms of media may be encrypted, such that data is written to them encrypted by an encryption key (which can correspond to the device, the user, or be unique in some other way) and data read from them is scrambled unless decrypted with the corresponding decryption key.
- storage media can be made tamper-resistant such that it is difficult or impossible to alter or erase data stored to them, or to read data except by authorized means.
- WORM media or encrypted media as described above are one way to make storage media tamper resistant. Another way is to make storage media physically difficult to remove, such as by covering them with epoxy after they have been installed. Other methods of making storage media tamper resistant are also known in the art and can be used.
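- As a purely illustrative sketch of the encrypted-media concept described above (not part of the disclosed embodiments), the following Python fragment writes data through an authenticated-encryption layer keyed to the device; the key handling and file layout shown are assumptions.

```python
# Illustrative only: encrypted storage keyed to a device-specific key.
# Assumes the third-party "cryptography" package; a real device would hold
# the key in tamper-resistant memory rather than generating it here.
from cryptography.fernet import Fernet

def write_encrypted(path: str, data: bytes, device_key: bytes) -> None:
    token = Fernet(device_key).encrypt(data)  # authenticated encryption
    with open(path, "wb") as f:
        f.write(token)

def read_encrypted(path: str, device_key: bytes) -> bytes:
    with open(path, "rb") as f:
        token = f.read()
    # Decryption raises InvalidToken if the stored data was altered or the
    # wrong key is supplied, giving a basic tamper-evidence property.
    return Fernet(device_key).decrypt(token)

device_key = Fernet.generate_key()  # illustrative key provisioning
```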
- a first broad class of embodiments of the invention includes a video recording system comprising a camera, a wireless proximity tag reader, a storage memory, and control circuitry operable to receive image data from the camera, receive a proximity tag identifier identifying a proximity tag from the proximity tag reader, and store an encoded frame containing the image data and the proximity tag identity in the storage memory.
- Computer 102 can be a desktop computer, a laptop computer, a server computer, a mobile device such as a smartphone or tablet, or any other form factor of general- or special-purpose computing device.
- Depicted with computer 102 are several components, for illustrative purposes. In some embodiments, certain components may be arranged differently or absent. Additional components may also be present. Included in computer 102 is system bus 104, whereby other components of computer 102 can communicate with each other. In certain embodiments, there may be multiple busses or components may communicate with each other directly. Connected to system bus 104 is central processing unit (CPU) 106. Also attached to system bus 104 are one or more random-access memory (RAM) modules.
- Also attached to system bus 104 is graphics card 110.
- In some embodiments, graphics card 110 may not be a physically separate card, but rather may be integrated into the motherboard or the CPU 106.
- In some embodiments, graphics card 110 has a separate graphics-processing unit (GPU) 112, which can be used for graphics processing or for general-purpose computing (GPGPU).
- GPU 112 may be used for encoding, decoding, transcoding, or compositing video.
- Also on graphics card 110 is GPU memory 114.
- Connected (directly or indirectly) to graphics card 110 is display 116 for user interaction. In some embodiments no display is present, while in others it is integrated into computer 102.
- Peripherals such as keyboard 118 and mouse 120 are connected to system bus 104. Like display 116, these peripherals may be integrated into computer 102 or absent.
- Local storage 122 may be any form of computer-readable media, and may be internally installed in computer 102 or externally and removably attached.
- Network interface card (NIC) 124 is also attached to system bus 104 and allows computer 102 to communicate over a network such as network 126.
- NIC 124 can be any form of network interface known in the art, such as Ethernet, ATM, fiber, or Wi-Fi (i.e., the IEEE 802.11 family of standards).
- NIC 124 connects computer 102 to local network 126, which may also include one or more other computers, such as computer 128, and network storage, such as data store 130.
- Local network 126 is in turn connected to Internet 132, which connects many networks such as local network 126, remote network 134 or directly attached computers such as computer 136.
- computer 102 can itself be directly connected to Internet 132.
- In FIG. 2, a system diagram illustrating the components of one embodiment of the invention is depicted; the depicted system is generally referred to by reference numeral 200.
- one component is camera 202, which captures imagery of a scene including one or more objects.
- camera 202 is a still camera.
- camera 202 is a video camera.
- a plurality of cameras may be simultaneously capturing imagery of the scene.
- Camera 202 may be a visible-spectrum camera, an infrared camera, a millimeter-wave camera, a low-light camera, or any other form of imaging device now known in the art or hereafter developed.
- Camera 202 may also include a microphone to capture audio along with video data.
- the objects in the scene include a suspect 204, a police officer 206, and a patrol cruiser 208.
- Some objects in the scene may be tagged with proximity tags.
- officer 206 is wearing a badge that includes proximity tag 210, and patrol cruiser 208 may have an integrated proximity tag 212.
- a proximity tag associated with a user is embedded in a credit card-sized proximity card that can be carried in the user's wallet.
- objects not visible in the scene may also have proximity tags.
- a second officer standing out of view may have a proximity tag, or officer 206 may have a service weapon that is not visible in the scene but has a proximity tag.
- Other objects in the scene, such as suspect 204, may not be tagged, or may start untagged and later become tagged. In this exemplary scenario, suspect 204 could become tagged by being restrained with handcuffs containing a proximity tag.
- Any object that may need to be specifically identified if it appears in recorded image data can be tagged with a proximity tag.
- taggable objects in a law enforcement scenario include badges, service weapons, canine units, patrol cruisers, forensic kits, breath analyzers, radar and lidar guns, and evidence bags. It will be apparent to a person of skill in the art that the objects to be tagged will be different depending on the scenario.
- A proximity tag, such as proximity tag 210 or 212, is any device that radiates an identifying signal, herein referred to as the proximity tag identifier, that can be read by a corresponding reader such as proximity tag reader 214.
- Proximity tags can be active (meaning that they periodically broadcast their identifier), assisted passive (meaning that they broadcast their identifier only when interrogated by a signal from the reader), or passive (meaning that they have no power source and must be illuminated by a signal from the proximity tag reader in order to radiate their identifier). Other forms of proximity tags are also possible.
- Proximity tag identifiers may be preprogrammed into proximity tags, or may be field-programmable, such that the user assigns the identifier when the proximity tag is deployed.
- One common form of proximity tag system is the radio-frequency identification (RFID) tag and the corresponding RFID reader.
- Another form of proximity tag system utilizes a challenge-response protocol to avoid the spoofing of a proximity tag identifier.
- Any form of proximity tag, now known or hereafter developed, can be used.
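- To make the challenge-response variant mentioned above concrete, the following sketch has the reader issue a fresh random challenge and the tag answer with a keyed hash over it; the message layout and the shared per-tag secret are assumptions for illustration, not the protocol disclosed here.

```python
# Sketch of a challenge-response proximity tag exchange. Because each
# challenge is a fresh nonce, replaying a previously observed response
# (i.e., spoofing the tag identifier) fails verification.
import hashlib
import hmac
import os

TAG_SECRET = b"per-tag secret provisioned at deployment"  # assumption

def reader_challenge() -> bytes:
    return os.urandom(16)  # fresh nonce per interrogation

def tag_response(tag_id: bytes, challenge: bytes) -> bytes:
    return hmac.new(TAG_SECRET, tag_id + challenge, hashlib.sha256).digest()

def reader_verify(tag_id: bytes, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(TAG_SECRET, tag_id + challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = reader_challenge()
assert reader_verify(b"TAG-210", challenge, tag_response(b"TAG-210", challenge))
```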
- Proximity tag reader 214 receives the proximity tag identifiers transmitted by proximity tags such as proximity tags 210 and 212. Depending on the type of proximity tag, a different type of reader may be required to receive the proximity tag identifiers. For example, an active reader is required to read passive tags. In some embodiments, the proximity tag reader can determine the distance to the transmitting tag based on signal strength or other information. In some embodiments, multiple proximity tag readers are present. In some such implementations, positional information about the tag can be determined based on relative signal strength at each reader.
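- One way to derive positional information from relative signal strength at multiple readers, as described above, is an RSSI-weighted centroid of the reader positions; the following sketch is illustrative only (the reader coordinates and the dBm-to-power conversion are assumptions).

```python
# Sketch: coarse tag localization from relative signal strength at several
# readers, using an RSSI-weighted centroid of the reader positions.
def weighted_centroid(readers):  # readers: [(x, y, rssi_dbm), ...]
    # Convert dBm to linear power so stronger readings dominate the estimate.
    weights = [(x, y, 10 ** (rssi / 10.0)) for x, y, rssi in readers]
    total = sum(w for _, _, w in weights)
    x = sum(xi * w for xi, _, w in weights) / total
    y = sum(yi * w for _, yi, w in weights) / total
    return x, y

# e.g., weighted_centroid([(0, 0, -40), (10, 0, -60), (0, 10, -55)])
```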
- sensors 216 collect or receive data to supplement the audiovisual data provided by camera 202.
- sensors include additional microphones for recording supplementary audio data, additional clocks for providing time data, a radio receiver for recording radio transmissions, a global-positioning system (GPS) receiver for recording position data, a breath analyzer for detecting intoxication, a fingerprint reader for logging individual identity, and one or more accelerometers for recording movement and acceleration data.
- Additional sensors, such as a holster event sensor for detecting when a holster cover is opened or when a weapon is removed from the holster, may be directly or wirelessly connected.
- raw image data from camera 202 is first processed by encoder 218.
- Raw image data may be encoded by any still image or video codec now known in the art or developed in the future.
- many image and video file container formats provide for the addition of metadata to the image or video data. Where such provision is not made, metadata can be stored in an auxiliary file and optionally linked to specific video frames or still images using timestamp data, including a time when the image data was acquired, a filename for where the auxiliary data is stored, or similar.
- Combiner 220 combines encoded video data from encoder 218 and proximity tag identity data received from proximity tag reader 214, resulting in an augmented encoded frame.
- the identities of multiple proximity tags are added to a single augmented encoded frame.
- data from sensors 216 is also added to the augmented encoded frame.
- In some embodiments, not every frame is augmented. For example, when encoding MPEG data, identity tag data may only be included with I-frames. In other embodiments, identity tag data may be included in any frame where the set of detected tags changes. In some embodiments, data relating to the signal strength associated with each tag may also be included with the tag identity.
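- The per-frame augmentation policies just described (augment only I-frames, or augment whenever the detected tag set changes) could be expressed as in the following sketch; the frame and metadata interfaces are hypothetical.

```python
# Sketch of the frame-augmentation decision. `is_iframe` and the
# `metadata` mapping are hypothetical interfaces, not a real codec API.
def should_augment(is_iframe: bool, current_tags: set,
                   previous_tags: set, policy: str = "on_change") -> bool:
    if policy == "iframes_only":
        return is_iframe                      # e.g., only MPEG I-frames
    if policy == "on_change":
        return current_tags != previous_tags  # detected tag set changed
    return True                               # augment every frame

def augment(metadata: dict, tags_with_rssi: dict) -> None:
    # Attach tag identifiers (and optionally signal strength) as metadata.
    metadata["proximity_tags"] = [
        {"id": tag_id, "rssi": rssi}
        for tag_id, rssi in sorted(tags_with_rssi.items())
    ]
```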
- augmented encoded frames are further digitally signed by the recording device to verify that they have not been tampered with. To do this, some embodiments use a device-specific key stored in memory 222. In some such embodiments, memory 222 is tamper resistant, such that it is difficult or impossible to extract the device-specific key.
- the digital signature can be generated by signer 224 using any algorithm for producing digital signatures now known in the art or later developed, including public-key schemes such as the Digital Signature Algorithm (DSA) and keyed hash schemes such as the Hash-based Message Authentication Code (HMAC).
- In some embodiments, individual images or frames will be signed. In other embodiments, entire video files are signed. In still other embodiments, groups of frames are collectively signed. In yet other embodiments, only those frames that contain identity tag data are signed.
- the digital signatures are stored as metadata with the encoded frame. In other embodiments, the digital signatures are stored as metadata with the container file. In still other embodiments, an auxiliary file containing a detached signature can be generated, potentially with timestamps corresponding to the frames that are signed.
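- As one concrete possibility for signer 224, the sketch below computes a keyed hash (HMAC) over an encoded frame together with its metadata using a device-specific key; a public-key scheme such as DSA could be substituted, and the byte layout here is an assumption for illustration only.

```python
# Sketch: keyed-hash authentication of an augmented encoded frame.
# Signing frame plus metadata binds the tag identifiers to the imagery;
# signing the frame alone would leave the metadata unauthenticated.
import hashlib
import hmac
import json

def sign_frame(device_key: bytes, encoded_frame: bytes, metadata: dict) -> bytes:
    payload = encoded_frame + json.dumps(metadata, sort_keys=True).encode()
    return hmac.new(device_key, payload, hashlib.sha256).digest()

def verify_frame(device_key: bytes, encoded_frame: bytes,
                 metadata: dict, signature: bytes) -> bool:
    return hmac.compare_digest(
        sign_frame(device_key, encoded_frame, metadata), signature)
```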
- Augmented frames, together with non-augmented frames when they are present, are combined into a container file format by combiner 226 to produce a final video file.
- the multiple files containing image data, proximity tag identity data, sensor data and signature data may all be combined into a single file by combiner 226.
- In some embodiments, combiner 226 may be integrated into encoder 218.
- combiner 226 may re-encode, transcode, or compress data for efficient storage.
- a display such as display 234 may also be present in system 200 for viewing the final video data in real time.
- display 234 may be configured to overlay data such as sensor data and/or proximity tag identity data onto the video data.
- display 234 takes the form of a heads-up display that overlays sensor data and/or proximity tag data on the scene as viewed by the user.
- the video file produced (in this embodiment) by combiner 226 is stored in data store 228 for subsequent retrieval.
- data store 228 takes the form of tamper-proof storage so that the video data, once stored, cannot be deleted or altered.
- data store 228 is a remote data store to which data can be uploaded at the end of a shift or in real time.
- a computer such as computer 230 may be able to access the video file in real time for display on display 232 to a remote viewer such as a dispatcher.
- a method of recording authenticated video with presence data comprising the steps of creating an augmented encoded frame by encoding video data into an encoded frame, receiving one or more proximity tag identifiers from a proximity tag reader, including the received proximity tag identifiers as metadata for the encoded frame to produce the augmented encoded frame, generating a digital signature for the augmented encoded frame, and storing the augmented encoded frame and digital signature.
- In FIG. 3, a flowchart illustrating the operation of one embodiment of the present invention is depicted. The method of creating augmented video may be initiated manually, or by a local or remote sensor-related triggering event.
- Such a sensor-related triggering event may be generated directly by the sensor, or by a recording device manager, such as a Digital Ally® VuLink®, that controls and synchronizes various recording devices.
- the recording device manager may communicate (via wireless communication, wired communication, or both) to sensors such as described herein, one or more person-mounted camera units, a vehicle-mounted video camera oriented to observe events external to the vehicle, a vehicle-mounted video camera oriented to observe events internal to the vehicle, and/or one or more storage elements.
- the recording device manager detects when one video camera begins recording, and then instructs all other associated devices to begin recording.
- the recording device manager may also send information indicative of a time stamp to the various recording devices for corroborating the recorded data.
- For example, the recording device manager may instruct all associated video cameras to begin recording upon the receipt of a signal from a sensor such as a breath analyzer that a breath analysis has begun. This ensures that multiple video cameras record the breath analysis, for future authentication that the breath analysis was performed correctly.
- the recording device manager may also send a time stamp to all the associated video cameras to provide a corroboration of the various recorded data.
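- As an illustrative sketch (with a hypothetical device interface), the recording device manager's trigger handling described above might look like the following.

```python
# Sketch of a recording device manager reacting to a triggering event by
# starting all associated recorders and distributing a shared time stamp
# for later corroboration. The device interface is hypothetical.
import time

class RecordingDeviceManager:
    def __init__(self, devices):
        self.devices = devices  # hypothetical objects with start_recording()

    def on_trigger(self, source: str) -> None:
        stamp = time.time()  # common time stamp for all recordings
        for device in self.devices:
            device.start_recording(trigger_source=source, timestamp=stamp)
```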
- an officer wearing a badge with an embedded proximity tag enters a patrol car. Detection of this proximity tag by the recording device manager serves as a triggering event and causes cameras in the patrol car and on the officer's body to begin recording.
- Proximity tag identifiers associated with both the officer and the patrol car are stored as metadata with both recordings. In this way, not only are the officer and the patrol car associated with each video recording, but the common proximity tag identifier data allows the two video recordings to be associated with each other as well.
- the method begins at step 302, when image data is received.
- the image data is a single frame of raw video data received from a video camera.
- it is a pre-encoded frame or frames from a video camera.
- it is an image received from a still camera.
- it is image data in any combination of the forms above from a plurality of cameras.
- At step 304, the image data received is encoded into a frame.
- Any image-coding algorithm now known in the art or hereafter developed can be employed for this encoding step. This process can also involve decoding, transcoding, or recoding the image data. Furthermore, if the image data was received in encoded form, no additional encoding may be necessary at this step. In embodiments where image data is received from multiple cameras, this step may also involve synchronizing, merging, and/or combining the image data from the multiple cameras. In some embodiments, image data from multiple cameras may be combined into a single augmented frame; in other embodiments, image data from multiple video cameras is stored in separate video data files. In still other embodiments, features of video codecs and container files allowing multiple camera angles of a single scene may be used to store separate frames containing video data from each of the cameras in the same video data file.
- Next, proximity tag identity data for one or more proximity tags is received from the proximity tag reader.
- signal strength information or other information relating to the position of the tag relative to the reader is also received.
- data is received from a plurality of tag readers. In such embodiments, different tag readers may read tag identity data for some or all of the same proximity tags, or a tag may be read by only one reader of the plurality of tag readers.
- At step 308, supplemental data from a sensor such as sensor 216 is received.
- data is received from more than one such sensor.
- data may be received every frame, for a subset of frames, or for only a single frame.
- Different sensors may provide data at different rates and different times, so that successive frames may have different sets of supplemental data available.
- the user triggering the siren and/or light bar of a patrol cruiser might be a triggering event.
- a velocity or acceleration reading either from the cruiser or from integrated velocity and/or acceleration sensors may be a triggering event.
- a vehicle crash, detected by an accelerometer reading, airbag deployment, or similar stimulus might be a trigger event.
- a positional reading could be a triggering event. Such a positional reading could be absolute (for example, entering or exiting a particular geofenced area) or relative (for example, moving more than a particular distance from a patrol cruiser or other fixed or mobile point of reference).
- a further example of a triggering event is an indication from a sensor configured to detect when a holster cover is opened or when a weapon is removed from the holster.
- Another form of user-related triggering event could come in the form of one or more biometric stress indications (such as elevated heart rate, blood pressure, respiration, etc.) obtained from biometric sensors worn by the user.
- audio data could generate triggering events if raised voices or high levels of vocal stress are detected.
- Triggering events can also come from the context of the data being collected. For example, when encoder 218 detects that the video data it is processing contains a face, a triggering event could be generated. Alternately, this functionality could be limited to the recognition of a particular face (for example, if the user sees a face matching a photograph provided with a warrant, or on a wanted poster, a trigger event could be generated). Similar recognition algorithms can be applied to other data streams as well; for example, the audio signature of a gunshot could be a triggering event, or the positional signature of evasive maneuvering.
- A triggering signal can also be generated manually by the user or, in embodiments where data is streamed to a remote data store, by a remote observer.
- a person of skill in the art will recognize that a wide variety of triggering signals are possible and that variations and combinations of the above will become apparent.
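- As a sketch of how the positional triggers above might be evaluated (the absolute geofence case and the relative distance-from-cruiser case), consider the following; the distance approximation and thresholds are assumptions for illustration.

```python
# Sketch of positional triggering events. Uses an equirectangular
# approximation, adequate for the short distances involved; the 100 m and
# 50 m thresholds are illustrative assumptions.
import math

EARTH_RADIUS_M = 6371000.0

def meters_between(lat1, lon1, lat2, lon2) -> float:
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    return EARTH_RADIUS_M * math.hypot(dlat, dlon)

def geofence_trigger(pos, fence_center, radius_m=100.0) -> bool:
    # Absolute positional trigger: entering a geofenced area.
    return meters_between(*pos, *fence_center) <= radius_m

def separation_trigger(officer_pos, cruiser_pos, max_m=50.0) -> bool:
    # Relative positional trigger: moving too far from the patrol cruiser.
    return meters_between(*officer_pos, *cruiser_pos) > max_m
```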
- If the determination is made to augment the frame, processing proceeds to step 312; otherwise, processing continues at decision 314.
- At step 312, some or all of the proximity tag data and supplementary sensor data is stored as metadata for the encoded frame, creating an augmented encoded frame.
- sensor data and proximity information can be stored directly with the encoded frame.
- At decision 314, a determination of whether to authenticate the (augmented or non-augmented) frame is made. This decision can be based on the same factors as the determination of whether to augment a frame; however, if the computational cost of calculating a digital signature is high, it may not be feasible to calculate a digital signature for every frame.
- only some of the frames are digitally signed.
- sets of frames are digitally signed as a group.
- an entire video file is signed as a single unit. If the determination is made to generate a digital signature, processing proceeds to step 316.
- At step 316, the digital signature is generated. As discussed above, any digital signature or authentication code algorithm can be used to produce these digital signatures.
- Next, the digital signature is stored. As with the metadata discussed above, digital signatures can be stored with the corresponding frame, with the corresponding video file, or in a separate file. In some embodiments, the digital signature is calculated only over the encoded frame; in other embodiments, the digital signature is calculated over the encoded frame together with any metadata relating to it.
- the frame is stored in the container video file at step 320.
- Any container file format can be used.
- Where the encoded frames are encoded using the MPEG-4 Part 10 codec (commonly referred to as H.264), a variety of container formats including 3GP, DivX, MP4, and MPEG Program Stream can be used. Different video codecs can also use these container file formats or a variety of others.
- an archive file format such as ZIP may be used.
- nested container file formats can be used, such as a ZIP file containing an MP4 file containing video, an XML file containing proximity tag and sensor data, and a PGP signature file containing a detached signature.
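- A minimal sketch of assembling such a nested container follows, assuming the MP4 video, XML metadata, and detached signature payloads have already been produced; the member file names are illustrative, not mandated by the disclosure.

```python
# Sketch: packaging video, proximity tag/sensor metadata, and a detached
# signature into a single ZIP container, as in the nested-container
# example above. Producing the payloads themselves is out of scope here.
import zipfile

def write_container(out_path: str, mp4_bytes: bytes,
                    metadata_xml: str, detached_sig: bytes) -> None:
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as z:
        z.writestr("video.mp4", mp4_bytes)        # encoded video frames
        z.writestr("metadata.xml", metadata_xml)  # proximity tag + sensor data
        z.writestr("video.sig", detached_sig)     # detached signature
```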
- the container file is written to memory (such as data store 228).
- the container file is not written out until video recording completes.
- the video file is progressively written out as additional frames are generated.
- intermediate container files are generated and stored, and a final "closed" container is written out once recording terminates.
- a computer-readable medium storing an augmented video file comprising a plurality of augmented frames, each augmented frame of the plurality of augmented frames including video data and one or more identifiers each associated with a proximity tag.
- In FIGs. 4A and 4B, two embodiments of an augmented video file are depicted.
- In FIG. 4A, an augmented video file 402 that stores proximity tag data, sensor data, and digital signature data as metadata for the associated frame is depicted.
- Frame 404 is an augmented frame, including metadata fields for proximity tag data 406, sensor data 408, and digital signature 410, which authenticates proximity tag data 406, sensor data 408, and image 412. Also included in frame 404 is the actual image data 412.
- Also included is frame 414, which is an augmented but non-authenticated frame. Accordingly, there are metadata fields including proximity tag data 416 and sensor data 418 in addition to image data 420.
- Also included is frame 422, which is a non-augmented, non-authenticated frame. Accordingly, only image data 424 is present (in addition to whatever metadata fields are ordinarily generated in the process of encoding).
- FIG. 4B depicts an augmented video file 426 where all metadata is stored in auxiliary files.
- augmented video file 426 is a ZIP file containing video data file 428, which is in MP4 format, proximity tag and sensor data file 430, which is in XML format, and digital signature file 432, containing ASCII-encoded PGP signatures.
- Video data file 428 contains image data frames 434, 436, and 438. In some embodiments, these could be identical to image data received from the camera. In other embodiments, these could be re-encoded or transcoded.
- proximity tag and sensor data file 430 contains proximity tag and sensor data in XML format. Present with each set of proximity tag and sensor data is a frame identifier associating an image data frame such as image data frame 434 or 438 with the corresponding proximity tag and sensor data.
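- A hypothetical fragment of such a proximity tag and sensor data file might look like the following (expressed as a Python string for consistency with the other sketches); the element and attribute names are assumptions, since no schema is fixed by the disclosure.

```python
# Hypothetical shape of proximity tag and sensor data file 430. The frame
# identifier ties each metadata set to an image data frame such as 434.
METADATA_XML = """\
<recording device="camera-202">
  <frame id="434" timestamp="2015-10-16T14:03:07.040Z">
    <proximityTag id="TAG-210" rssi="-48"/>  <!-- officer's badge -->
    <proximityTag id="TAG-212" rssi="-61"/>  <!-- patrol cruiser -->
    <sensor type="gps" lat="38.9072" lon="-94.6275"/>
  </frame>
</recording>
"""
```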
- Augmented video file 426 also contains digital signature file 432, containing detached signatures for one or more video frames and/or the corresponding proximity tag and sensor data. Again, a frame identifier can be included with each digital signature to associate it with the corresponding image data frame.
- Video data file 428, alone or in combination with other augmented or non-augmented video files, can then be used to document a scene of interest. In some cases, post-processing of one or more video data files will be employed to better reconstruct such a scene. In a law enforcement scenario, for example, at the end of a shift, a large volume of video data may be collected from officers returning to the station house. If an incident of interest occurred during that shift, then it may be desirable to immediately collect all relevant recordings.
- this can be done by searching the metadata embedded in each video for one or more proximity tags relevant to the incident (such as, for example, a proximity tag associated with an officer, vehicle, or object known to have been present at the scene). In this way, all of the augmented video data pertaining to the incident can easily be collected without error or significant labor.
- proximity tags relevant to the incident such as, for example, a proximity tag associated with an officer, vehicle, or object known to have been present at the scene.
- augmented video data can easily be searched for any video data matching certain criteria.
- video data recorded in a certain area can easily be located by searching any location data (such as GPS metadata) stored in or with the video data for corresponding coordinates.
- Recordings of DUI stops can easily be obtained by searching for video data where a proximity tag associated with a breath analyzer is present. All video recordings of an officer can be quickly collected by searching video data for a proximity tag associated with that officer's badge.
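- The bulk-collection workflow described above might be sketched as follows, scanning a directory of container files for a tag of interest; it assumes the illustrative ZIP/XML layout from the earlier sketches.

```python
# Sketch: collect every recording whose metadata mentions a given proximity
# tag (an officer's badge, a breath analyzer for DUI stops, and so on).
import xml.etree.ElementTree as ET
import zipfile
from pathlib import Path

def recordings_with_tag(directory: str, tag_id: str) -> list[str]:
    matches = []
    for container in sorted(Path(directory).glob("*.zip")):
        with zipfile.ZipFile(container) as z:
            root = ET.fromstring(z.read("metadata.xml"))
            # Any frame listing the tag marks the whole recording as relevant.
            if root.find(f".//proximityTag[@id='{tag_id}']") is not None:
                matches.append(str(container))
    return matches

# e.g., recordings_with_tag("/evidence/2015-10-16", "TAG-210")
```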
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Television Signal Processing For Recording (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/517,368 US9159371B2 (en) | 2013-08-14 | 2014-10-17 | Forensic video recording with presence detection |
PCT/US2015/056039 WO2016061516A1 (en) | 2014-10-17 | 2015-10-16 | Forensic video recording with presence detection |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3207501A1 (en) | 2017-08-23 |
EP3207501A4 EP3207501A4 (en) | 2018-06-06 |
Family
ID=55747438
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15850436.5A Withdrawn EP3207501A4 (en) | 2014-10-17 | 2015-10-16 | Forensic video recording with presence detection |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3207501A4 (en) |
WO (1) | WO2016061516A1 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10165171B2 (en) | 2016-01-22 | 2018-12-25 | Coban Technologies, Inc. | Systems, apparatuses, and methods for controlling audiovisual apparatuses |
US10789840B2 (en) | 2016-05-09 | 2020-09-29 | Coban Technologies, Inc. | Systems, apparatuses and methods for detecting driving behavior and triggering actions based on detected driving behavior |
US10152858B2 (en) | 2016-05-09 | 2018-12-11 | Coban Technologies, Inc. | Systems, apparatuses and methods for triggering actions based on data capture and characterization |
US10370102B2 (en) | 2016-05-09 | 2019-08-06 | Coban Technologies, Inc. | Systems, apparatuses and methods for unmanned aerial vehicle |
JP6747158B2 (en) | 2016-08-09 | 2020-08-26 | ソニー株式会社 | Multi-camera system, camera, camera processing method, confirmation device, and confirmation device processing method |
CN116761080B (en) * | 2022-10-13 | 2024-07-12 | 荣耀终端有限公司 | Image data processing method and terminal equipment |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9848172B2 (en) * | 2006-12-04 | 2017-12-19 | Isolynx, Llc | Autonomous systems and methods for still and moving picture production |
JP5416763B2 (en) * | 2008-05-06 | 2014-02-12 | トレースオプティックス ピーティーイー リミテッド | Method and apparatus for camera control and composition |
US8816855B2 (en) * | 2008-10-21 | 2014-08-26 | At&T Intellectual Property I, L.P. | Methods, computer program products, and systems for providing automated video tracking via radio frequency identification |
EP2449485A1 (en) * | 2009-07-01 | 2012-05-09 | E-Plate Limited | Video acquisition and compilation system and method of assembling and distributing a composite video |
US8827824B2 (en) * | 2010-08-26 | 2014-09-09 | Blast Motion, Inc. | Broadcasting system for broadcasting images with augmented motion data |
US8587672B2 (en) * | 2011-01-31 | 2013-11-19 | Home Box Office, Inc. | Real-time visible-talent tracking system |
US20130286232A1 (en) * | 2012-04-30 | 2013-10-31 | Motorola Mobility, Inc. | Use of close proximity communication to associate an image capture parameter with an image |
DE102013019488A1 (en) * | 2012-11-19 | 2014-10-09 | Mace Wolf | PHOTO WITH PROTECTION OF THE PRIVACY |
US9253452B2 (en) * | 2013-08-14 | 2016-02-02 | Digital Ally, Inc. | Computer program, method, and system for managing multiple data recording devices |
2015
- 2015-10-16 EP EP15850436.5A patent/EP3207501A4/en not_active Withdrawn
- 2015-10-16 WO PCT/US2015/056039 patent/WO2016061516A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
EP3207501A4 (en) | 2018-06-06 |
WO2016061516A1 (en) | 2016-04-21 |
Legal Events

Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
| 17P | Request for examination filed | Effective date: 20170517 |
| AK | Designated contracting states | Kind code of ref document: A1. Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| A4 | Supplementary search report drawn up and despatched | Effective date: 20180504 |
| RIC1 | Information provided on IPC code assigned before grant | Ipc: H04N 5/232 20060101ALI20180427BHEP; Ipc: H04N 5/77 20060101ALI20180427BHEP; Ipc: G06K 9/78 20060101AFI20180427BHEP; Ipc: G08B 13/196 20060101ALI20180427BHEP; Ipc: H04N 9/82 20060101ALI20180427BHEP; Ipc: G06K 7/10 20060101ALI20180427BHEP; Ipc: H04N 5/91 20060101ALI20180427BHEP; Ipc: G01S 13/74 20060101ALI20180427BHEP |
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 18D | Application deemed to be withdrawn | Effective date: 20181204 |