WO2019023389A1 - Systems and method for information capture and transmission - Google Patents


Info

Publication number
WO2019023389A1
Authority
WO
WIPO (PCT)
Prior art keywords
tagged
data
image data
recording device
sensor
Prior art date
Application number
PCT/US2018/043771
Other languages
French (fr)
Inventor
David Matthias Mack
Brian Dempsey Dold
Original Assignee
Senworth, Inc.
Priority date
Filing date
Publication date
Application filed by Senworth, Inc. filed Critical Senworth, Inc.
Publication of WO2019023389A1 publication Critical patent/WO2019023389A1/en


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/5866 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/188 Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position

Definitions

  • The processing elements 230 may store the video data 212 captured by the camera 220 in the video database 242 of the storage 240. The processing elements may continue storing video data 212 in the video database until a determination is made to stop recording (e.g., responsive to a sensor, expiration of a predetermined time from the triggering event, etc.).
  • The method 300 may include recording post-trigger information with the recording device 200 (FIGS. 1 and 2) following the determination that the triggering event occurred.
  • Recording post-trigger information may include recording video data 212 with a different resolution than pre-trigger video data.
  • Recording post-trigger information may include recording video data 212 with a higher resolution than pre-trigger video data.
  • The guard may include a mechanical button for the sensor 112B that may be pressed by a structural element of the firearm 110A (e.g., by a light attached below a barrel, etc.). Mechanical locks that hold the firearm 110A in place within the holster 110B may prevent false triggers.
  • The information capture system 100A may only include one or some of the objects 110A-110H and/or sensors 112A-112L.
  • The system 100A may only include the firearm 110A, the firearm holster 110B, and the firearm holster sensor 112B.
  • The information capture system 100A may include other objects and sensors instead of or in addition to the objects 110A-110H and 112A-112H.
  • The geolocation sensor 112L may function in conjunction with a motion detector (not shown).
  • The geolocation sensor 112L may function in a low-power mode to conserve power when the motion detector does not detect motion in the proximity of the
  • The recording device 500 also includes a processor 530 that is operably coupled to the data storage device 520 and the image capture device 510.
  • The processor 530 is configured to access data stored by the data storage device 520 (e.g., the image data 512, the feature data 524 of the feature database 522, the tagged image data 534, etc.).
  • The processor 530 is also configured to detect one or more of the features indicated by the feature data 524 that are desired to be identified (e.g., cars, signs, license plates, buildings, human forms, colors, or human faces) in a captured image by comparing the feature data 524 to the captured image data 512.
  • Example 28: The tagged information capture system according to any one of Examples 23-27, wherein the remote communication device includes a device of a social media service provider configured to publish the at least a tagged portion of the tagged image data.
  • Example 29: The tagged information capture system according to any one of Examples 23-28, wherein the remote communication device includes a computing device of a stand-alone control center for the tagged information capture system.

Abstract

Disclosed herein are tagged information servers. One or more tagged information servers include one or more processors, and one or more non-transitory computer-readable media including computer-readable instructions stored thereon. The computer-readable instructions are configured to instruct the one or more processors to store tagged image data on a data storage device. The tagged image data includes image data captured by a recording device and tag data correlating elements of an image corresponding to the captured image data to features that are desired to be tagged. The computer-readable instructions are also configured to instruct the one or more processors to distribute at least a tagged portion of the tagged image data to a remote communication device.

Description

SYSTEMS AND METHOD FOR INFORMATION CAPTURE AND TRANSMISSION
Related Applications
[0001] This application claims priority to and the benefit of United States Provisional Application No. 62/607,226, filed December 18, 2017, and entitled SYSTEMS AND METHODS FOR INFORMATION CAPTURE AND TRANSMISSION, United States Provisional Application No. 62/607,228, filed December 18, 2017, and entitled SYSTEMS AND METHODS FOR INFORMATION CAPTURE AND TRANSMISSION, and United States Provisional Application No. 62/537,392, filed July 26, 2017, and entitled SYSTEMS AND METHODS FOR INFORMATION CAPTURE AND TAGGING, the entire disclosure of each of which is incorporated herein by reference.
Technical Field
[0002] The disclosure relates generally to systems and methods for information capture, and more particularly to sensor-triggered image data capture.
Background
[0003] Computers have become highly integrated in the workforce, in the home, in mobile devices, and many other places. Computers can process massive amounts of information quickly and efficiently. Software applications designed to run on computer systems allow users to perform a wide variety of functions including business applications, schoolwork, entertainment and more. Software applications are often designed to perform specific tasks, such as word processor applications for drafting documents, or email programs for sending, receiving and organizing email. In some cases, computer systems are configured to communicate with each other using network protocols. Such computer systems can communicate with cloud servers configured to provide cloud services including software applications.
[0004] The subject matter claimed herein is not limited to embodiments that solve any disadvantages or that operate only in environments such as those described above. Rather, this background is only provided to illustrate one exemplary technology area where some embodiments described herein may be practiced.
Brief Description Of The Drawings
[0005] In order to describe the manner in which the above-recited and other advantages and features of the invention can be obtained, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered to be limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
[0006] FIG. 1 is a simplified block diagram of an information capture system, according to one embodiment;
[0007] FIG. 2 is a simplified block diagram of an example of a recording device of the information capture system of FIG. 1;
[0008] FIG. 3 is a simplified flowchart of an example method of information capture, according to one embodiment;
[0009] FIGS. 4A and 4B illustrate a specific, non-limiting example of an information capture system, according to one embodiment;
[00010] FIG. 5A is a simplified block diagram of a recording device, according to some embodiments;
[00011] FIG. 5B is a simplified block diagram of a recording device, according to some embodiments;
[00012] FIG. 6 is a front view of a recording device, according to some embodiments;
[00013] FIG. 7 is a simplified illustration of a tagged information capture system, according to some embodiments;
[00014] FIG. 8 is a simplified illustration of a tagged information capture system, according to some embodiments; and
[00015] FIG. 9 is a simplified illustration of a tagged information distribution system, according to some embodiments.
Detailed Description
[00016] In the following detailed description, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration specific embodiments in which the present disclosure may be practiced. These embodiments are described in sufficient detail to enable those of ordinary skill in the art to practice the disclosure made herein. It should be understood, however, that the detailed description and the specific examples, while indicating examples of embodiments of the disclosure, are given by way of illustration only, and not by way of limitation. From the disclosure, various substitutions, modifications, additions, rearrangements, or combinations thereof within the scope of the disclosure may be made and will become apparent to those of ordinary skill in the art.
[00017] In accordance with common practice, the various features illustrated in the drawings may not be drawn to scale. The illustrations presented herein are not meant to be actual views of any particular apparatus (e.g., device, system, etc.) or method, but are merely idealized representations that are employed to describe various embodiments of the disclosure. Accordingly, the dimensions of the various features may be arbitrarily expanded or reduced for clarity. In addition, some of the drawings may be simplified for clarity. Thus, the drawings may not depict all of the components of a given apparatus or all operations of a particular method.
[00018] Information and signals described herein may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof. Some drawings may illustrate signals as a single signal for clarity of presentation and description. It should be understood by a person of ordinary skill in the art that the signal may represent a bus of signals, wherein the bus may have a variety of bit widths and the present disclosure may be implemented on any number of data signals including a single data signal.
[00019] The various illustrative logical blocks, modules, circuits, and algorithm acts described in connection with embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and acts are described generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the disclosure described herein.
[00020] In addition, it is noted that the embodiments may be described in terms of a process that is depicted as a flowchart, a flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe operational acts as a sequential process, many of these acts can be performed in another sequence, in parallel, or substantially concurrently. In addition, the order of the acts may be re-arranged. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. Furthermore, the methods disclosed herein may be implemented in hardware, software, or both. If implemented in software, the functions may be stored or transmitted as one or more computer-readable instructions (e.g., software code) on a computer-readable medium. Computer-readable media includes both computer storage media (i.e., non-transitory media) and communication media including any medium that facilitates transfer of a computer program from one place to another.
[00021] Elements described herein may include multiple instances of the same element. These elements may be generically indicated by a numerical designator (e.g., 200) and specifically indicated by the numerical indicator followed by an alphabetic designator (e.g., 200A). For ease of following the description, for the most part, element number indicators begin with the number of the drawing on which the elements are introduced or most fully discussed. Thus, for example, element identifiers in FIG. 1 will be mostly in the numerical format 1xx, and elements in FIG. 3 will be mostly in the numerical format 3xx.
[00022] As used herein, the term "object with sensor" refers to any object that includes a sensor that is capable of detecting events that occur in proximity to the object.
[00023] As used herein the term "recording device" refers to devices capable of recording information corresponding to events that occur in proximity to the object. For example, recording devices may include image capture devices capable of capturing images. As used herein, the term "image capture device" refers to digital and analog image capture devices, such as, for example, digital cameras, digital camcorders, analog cameras, analog camcorders, webcams, other image capture devices known in the art, and combinations thereof. As used herein, the term "image" refers to both still images and video images. As used herein, the term "still image" refers to an image having a single frame. Also, as used herein, the term "video image" refers to an image having multiple frames. Furthermore, as used herein, the terms "image data" and "video data" refer to data corresponding to one or more images that have been captured by an image capture device. "Image data" and "video data" include sufficient information for a rendering device, such as a computing device, to reconstruct for presenting the one or more images (e.g., either of a lossless and a lossy reconstruction) corresponding to the image data. "Image data" may be analog data or digital data. "Image data" and "video data" may refer to uncompressed image data or video data, or image data or video data that has been compressed (e.g., using any of a variety of image compression protocols). "Image data" may refer to both video image data and still image data. "Video image data" refers to data corresponding to a series of still images that are configured to be viewed consecutively.
[00024] As used herein, the term "in proximity to an object" refers to locations that are close enough to the object to trigger a sensor of the object. Often, events that are close enough to the object to trigger the sensor may also be close enough to a recording device to enable the recording device to record information corresponding to the event that triggers the sensor.
[00025] Embodiments of the disclosure include various information capture systems that are automatically triggered through sensor stimuli, and related methods.
[00026] FIG. 1 is a simplified block diagram of an information capture system 100. The information capture system 100 may include an object with sensor 110 including an object configured to be involved with a possible event (e.g., a future, upcoming, and/or anticipated event), and one or more sensors 112 (hereinafter "sensor" 112) secured to the object. The sensor 112 may be configured to detect one or more stimuli that are associated with the possible event and transmit a sensor signal 172 (e.g., using one or more communication elements 114 operably coupled to the sensor 112) indicating data corresponding to the one or more stimuli. The information capture system 100 may also include one or more recording devices 200 (hereinafter "recording device" 200) configured to record information (e.g., still images, video images, audio, heat readings, and combinations thereof) responsive to a triggering event determined from the data indicated by the sensor signal 172. In this way, the recording device 200 may record information about the possible event responsive to a determination that a triggering event has occurred.
[00027] In some embodiments, the object with sensor 110 may include sports equipment, and the recording device 200 may be configured to record information (e.g., video) responsive to activity involving the sports equipment. By way of non-limiting example, the object with sensor 110 may include a surfboard, a skateboard, a snowboard, a wakeboard, a ski, a bicycle, a motorcycle, a kayak, a stand-up paddle board, a canoe, an all-terrain vehicle (ATV), an automobile, a ramp, a ball, a baseball bat, a golf club, a hockey stick, a goal (e.g., a basketball rim, a hockey or soccer goal, etc.), and other sports equipment. As a specific, non-limiting example, the object with sensor 110 may include a surfboard, and the recording device 200 may be configured to record video responsive to a sensor 112 secured to the surfboard sensing that a user stood up on the surfboard. Of course, each type of sports equipment may have different detectable stimuli associated therewith that may correspond to events of interest for recording with the recording device 200.
[00028] In some embodiments, the object with sensor 110 may include a wearable device. By way of non-limiting example, the object with sensor 110 may include a gun holster, a heart-rate monitor, a glove, an article of clothing, a hat, a helmet, a watch, a bracelet, an armband, a leg band, a headband, a shoe, and other wearable devices. As a specific, non-limiting example, the object with sensor 110 may include a gun holster, and the recording device 200 may include a dash video camera in a law-enforcement vehicle or a body camera worn by a law-enforcement officer wearing the gun holster. The dash video camera may record video responsive to a law-enforcement officer drawing the gun from the holster. Similarly, the body camera may begin recording video responsive to the law-enforcement officer drawing the gun from the holster. Of course, many different applications may correspond to each of the different wearable devices, and may be associated with a variety of different stimuli.
[00029] Other examples of the object with sensor 110 may include a walking stick, a mirror, a window, a door, and any other object that may receive stimuli corresponding to possible events of interest for recording corresponding information.
[00030] In some embodiments, the sensor 112 may include a biometric sensor. By way of non-limiting example, the sensor 112 may include an accelerometer, a heart-rate sensor, a body temperature sensor, a pedometer, other biometric sensors, and combinations thereof. In such embodiments, information corresponding to events that trigger biometric responses in a person may be recorded responsive to certain biometric triggers (e.g., a heart-rate above a predetermined level, an accelerative force above a predetermined threshold, accelerometer readings corresponding to a certain type of activity, etc.).
[00031] In some embodiments the sensor 112 may include other accelerometers (e.g., non-biometric), a pressure sensor, a capacitive touch sensor, a heat sensor, a temperature sensor, a gyroscope, a motion sensor, an infrared sensor, a light sensor, an acoustic sensor, a moisture sensor, a strain gauge, an image sensor, a proximity sensor, an ambient light sensor, a connector that senses when the connector is connected and disconnected, a global positioning system (GPS) sensor, other sensors, and combinations thereof. Accordingly, various types of stimuli may trigger the recording device 200 to record information. In some embodiments, multiple sensors 112 may be included.
[00032] In some embodiments, the sensor 112 may be configured to transmit the sensor signal 172 wirelessly. In such embodiments, the communication elements 114 may include at least a wireless communication device. By way of non-limiting example, the wireless communication device may be configured to communicate using Bluetooth, low power Bluetooth, WiFi, Zigbee, mobile wireless networks (e.g., long term evolution (LTE), 3G, etc.), other wireless communication protocols, or combinations thereof. In some embodiments, the communication elements 114 may be configured to enable the sensor 112 to communicate using a wired communication link (e.g., Universal Serial Bus (USB), Firewire (IEEE 1394), Ethernet (IEEE 802.3), other wired communication links, or combinations thereof). In some embodiments, the sensor 112 may be configured to transmit the sensor signal 172 securely. In some embodiments, the communication elements 114 include a global positioning system (GPS) device (which could be used to cause the recording device 200 to trigger responsive to the recording device 200 being positioned at a predetermined position, or within a predetermined range of the predetermined position).
[00033] In some embodiments, the recording device 200 may include an image capture device (e.g., secured to a dashboard of a vehicle, to a person, to a bicycle, etc.) configured to capture one or more images responsive to the triggering event. By way of non-limiting example, the image capture device may include a video image capture device 200A (FIG. 2) configured to record video responsive to the triggering event determined from the data indicated by the sensor signal 172. In some embodiments, the recording device 200 may include an audio recording device in addition to, or instead of, an image capture device. In some embodiments, the recording device 200 may include more than one recording device. In some embodiments including multiple recording devices 200, any one of the recording devices 200 that is triggered to start recording may transmit instructions (e.g., wirelessly, via wired communications, etc.) to others of the recording devices 200 to start recording. In some embodiments, the recording device 200 may be configured to start recording responsive to a user of the information capture system manually activating the recording device, in addition to recording responsive to triggering events detected by the sensor 112.
[00034] In some embodiments, the sensor 112 may include a geolocation sensor. By way of non-limiting example, the sensor 112 may be configured to trigger responsive to a user or object entering or leaving the vicinity (e.g., a predetermined range) of the sensor 112. Also by way of non-limiting example, a communication element may be placed in a desired location and designed to continuously or intermittently transmit a detect signal within a predetermined range to allow geolocated activation and/or threshold activation of the recording device 200 whenever it comes within the area of interest. The recording device 200 could be configured to stop recording when it has gone outside of the range of the geolocated transmitter, which covers the area of interest. As a specific, non-limiting example, a geolocated trigger may be placed at a ski jump so that a camera starts recording when a skier comes close to the jump and stops recording when the skier leaves the jump. Also by way of non-limiting example, a GPS could be used such that a recording device 200 is triggered responsive to a person or object arriving at or leaving a location (e.g., a specific global position or positions) or the vicinity thereof. For example, the recording device 200 could be activated by its actual global position as determined by a GPS and a user-predefined location.
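The geolocated activation described above can be sketched as a simple geofence check against a user-predefined location. This is an illustrative sketch only, not the implementation claimed in this disclosure; the class and function names, the use of latitude/longitude GPS fixes, and the haversine distance formula are assumptions made for illustration.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

class GeofenceTrigger:
    """Signals 'start' on entering a predefined area and 'stop' on leaving it."""

    def __init__(self, center_lat, center_lon, radius_m):
        self.center = (center_lat, center_lon)
        self.radius_m = radius_m
        self.inside = False

    def update(self, lat, lon):
        """Process a new GPS fix; returns 'start', 'stop', or None."""
        now_inside = haversine_m(lat, lon, *self.center) <= self.radius_m
        action = None
        if now_inside and not self.inside:
            action = "start"   # entered the area of interest
        elif not now_inside and self.inside:
            action = "stop"    # left the area of interest
        self.inside = now_inside
        return action
```

In the ski-jump example from the paragraph above, the fence center would sit at the jump and the camera would start on the first fix inside the radius and stop on the first fix outside it.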
[00035] In some embodiments, the recording device 200 (or recording devices 200) may be configured to stop recording responsive to a triggering event detected by the sensor 112, responsive to a manual user input, or combinations thereof. In some embodiments where there are multiple recording devices 200, any one of the recording devices 200 that is controlled to stop recording may also communicate (e.g., wirelessly, via wired communications, etc.) to the others of the multiple recording devices 200 indicating that the others of the multiple recording devices 200 should stop recording.
[00036] As a specific, non-limiting example, the recording device 200 may include a video image capture device 200A (see, e.g., FIG. 2) configured to constantly record and store a recent segment of video data, even before detecting the triggering event. The video image capture device 200A may also be configured to delete portions of the video data that were recorded at least a predetermined buffer period of time before a present time (e.g., 1, 2, 5, 10 seconds, etc. before the present time). The video image capture device 200A may further be configured to stop deleting the video data that was recorded the predetermined buffer period of time before the present time responsive to the triggering event. In this way, the video image capture device 200A may be capable of recording video data corresponding to events leading up to the triggering event without accruing a relatively large amount of video data. One way that additional storage space may be freed up is to record video before the triggering event at a different (e.g., lower) resolution than video that is recorded after the triggering event. More detail regarding an example of a video image capture device 200A is discussed below with reference to FIG. 2.
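The buffering behavior in the preceding paragraph can be sketched as a fixed-length ring buffer that silently discards the oldest frames until a trigger freezes it. This is a minimal sketch under stated assumptions, not the disclosed implementation: the class name, the frame representation, and the buffer length chosen by the caller are all illustrative.

```python
from collections import deque

class PreTriggerBuffer:
    """Keeps only the most recent frames until a trigger occurs, then
    retains the buffered pre-trigger frames plus everything afterward."""

    def __init__(self, buffer_frames=300):  # e.g. ~10 s of video at 30 fps
        self.pre_trigger = deque(maxlen=buffer_frames)  # oldest frames auto-discarded
        self.post_trigger = []
        self.triggered = False

    def add_frame(self, frame):
        if self.triggered:
            self.post_trigger.append(frame)   # keep everything after the trigger
        else:
            self.pre_trigger.append(frame)    # deque drops the oldest frame when full

    def trigger(self):
        # "Stop deleting": freeze the buffered frames and append from here on.
        self.triggered = True

    def recording(self):
        """Full retained recording: buffered lead-up plus post-trigger frames."""
        return list(self.pre_trigger) + self.post_trigger
```

This is how the device can capture the lead-up to an event without accruing unbounded video data: only a bounded window of pre-trigger frames ever exists at once.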
[00037] As another specific, non-limiting example, the recording device 200 may be equipped with a low power communication element (e.g., a low power Bluetooth device) that stays continuously on. The low power communication element may be capable of receiving the sensor signal 172 and/or the trigger signal 174 and providing instructions to the recording device 200 to power on and begin recording. Accordingly, the sensor signal 172 and/or the trigger signal 174 may effectively wake up the recording device 200.
[00038] As a relatively more generalized non-limiting example, the recording device 200 may be configured to constantly record and store information, and delete the information that was recorded a predetermined buffer period of time before a present time. When a triggering event is detected from the data indicated in the sensor signal 172, the recording device 200 may be configured to stop deleting the information that was recorded the predetermined buffer period of time before the present time. In some embodiments, the recording device 200 may be configured to stop recording a predetermined amount of time after being triggered to stop recording.
[00039] In some embodiments, the recording device 200 may include a wearable recording device. By way of non-limiting examples, the recording device 200 may include a law-enforcement body camera, a helmet camera, a camera integrated into a pair of glasses, a camera integrated into a watch, other wearable recording devices, or combinations thereof.
[00040] In some embodiments, the information capture system 100 may include one or more communication hubs 150 (sometimes referred to herein simply as "hub" 150) in electrical communication with the sensor 112 and the recording device 200 (e.g., using one or more communication elements 152). The hub 150 may be configured to receive the sensor signal 172 from the sensor 112, and transmit a trigger signal 174 to the recording device 200 responsive to detecting the triggering event from the sensor signal 172.
[00041] In some embodiments, the hub 150 may include a personal computing device (e.g., a smartphone, a tablet computer, a laptop computer, a desktop computer, a personal digital assistant (PDA), other personal computing device, or combinations thereof). In such embodiments, the hub 150 may be configured to communicate with at least one of the sensor 112 and the recording device 200 through a personal area network (PAN), a local area network (LAN), or a combination thereof with or without intervention from a wide area network (WAN) (e.g., the Internet). In some embodiments, the hub 150 may include one or more cloud server devices configured to engage in electrical communications with at least one of the sensor 112 and the recording device 200 through at least a WAN.
[00042] In some embodiments, the hub 150 may be configured to transmit status requests 160 to at least one of the sensor 112 and the recording device 200, and receive status information (e.g., sensor status 180, R.D. status 178, or a combination thereof) from the at least one of the sensor 112 and the recording device 200. By way of non-limiting example, the hub 150 may transmit a status request 160 requesting information indicating a battery level, health parameters, other information, and combinations thereof, to the at least one of the sensor 112 and the recording device 200. The hub 150 may, in response, receive at least one of the sensor status 180 and the R.D. status 178 from the sensor 112 and the recording device 200, respectively.
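The status request/response exchange described above can be sketched as the hub polling each device for a small status report. A minimal sketch under assumed message shapes: the class names, the fields reported (battery level, a basic health flag), and the in-process method call standing in for the wireless exchange are all illustrative assumptions, not the disclosed protocol.

```python
from dataclasses import dataclass

@dataclass
class DeviceStatus:
    """Status report returned by a sensor or recording device."""
    device_id: str
    battery_pct: int
    healthy: bool

class Device:
    """A sensor or recording device that answers hub status requests."""

    def __init__(self, device_id, battery_pct=100):
        self.device_id = device_id
        self.battery_pct = battery_pct

    def handle_status_request(self):
        # Reply with battery level and a simple health check
        # (here: healthy means more than 10% battery remaining).
        return DeviceStatus(self.device_id, self.battery_pct, self.battery_pct > 10)

class Hub:
    """Polls every registered device and collects their status reports."""

    def __init__(self, devices):
        self.devices = devices

    def poll_status(self):
        # Transmit a status request to each device and gather the responses.
        return {d.device_id: d.handle_status_request() for d in self.devices}
```

A hub built this way can surface a low-battery recording device before an event occurs, rather than discovering the failure when a trigger arrives.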
[00043] The hub 150 may include one or more processing elements 154 (e.g., a central processing unit (CPU), a microcontroller, a programmable logic controller (PLC), other processing elements, or combinations thereof) operably coupled to one or more storage devices 156 (hereinafter "storage" 156). The storage 156 may include volatile data storage (e.g., random access memory), non-volatile data storage (e.g., read-only memory, Flash memory, electrically programmable read-only memory (EPROM), compact discs (CDs), digital versatile discs (DVDs), etc.), other data storage devices, or combinations thereof. The storage 156 may be implemented with one or more semiconductor devices, optical storage media, magnetic tape, other data storage media, devices configured to read and/or write data to such data storage devices, and combinations thereof.
[00044] The storage 156 may include computer-readable instructions configured to instruct the processing elements 154 to perform operations that the hub 150 is configured to perform. By way of non-limiting example, the computer-readable instructions may be configured to instruct the processing elements 154 to analyze the data indicated by the sensor signal 172. The computer-readable instructions may also be configured to instruct the processing elements 154 to determine that a triggering event has occurred responsive to the sensor signal 172. Examples of triggering events may include sensor readings surpassing a predetermined threshold, demonstrating a recognizable pattern or output, other events, and combinations thereof.
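The trigger-detection logic described above (readings surpassing a predetermined threshold, or demonstrating a recognizable pattern) can be sketched as follows. This is an illustrative sketch only; the threshold value, function names, and subsequence-matching approach are assumptions for demonstration and are not part of the disclosure.

```python
THRESHOLD = 9.0  # hypothetical predetermined threshold (e.g., acceleration in g)

def exceeds_threshold(readings, threshold=THRESHOLD):
    """Triggering event: any sensor reading surpasses the threshold."""
    return any(r > threshold for r in readings)

def matches_pattern(readings, pattern):
    """Triggering event: the readings contain a recognizable pattern
    (here modeled simply as a contiguous subsequence match)."""
    n = len(pattern)
    return any(readings[i:i + n] == pattern for i in range(len(readings) - n + 1))

def triggering_event(readings, pattern):
    """Combine both criteria, as the text allows either to trigger."""
    return exceeds_threshold(readings) or matches_pattern(readings, pattern)
```

In a deployed hub 150, the threshold and pattern would be tuned per sensor type; the structure of the check, however, follows the paragraph above.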
[00045] In operation, the sensor 112 may detect information about events occurring in proximity to the object with sensor 110. The sensor 112 may transmit the sensor signal 172 including the information about the events to at least one of the recording device 200 and the hub 150 through the communication elements 114. The information from the sensor signal 172 may be processed by one of the recording device 200 and the hub 150 to determine if a triggering event occurred. If a triggering event occurred, the recording device 200 may record information corresponding to the events that occur in proximity to the object. The recording device 200 may stop recording the information a predetermined amount of time after the triggering event, in response to a manual input to the recording device 200, in response to another detected event, in response to a command received from one of the sensor 112 and the hub 150, or combinations thereof.
[00046] In this way, information (e.g., video data) may be recorded responsive to an event that is detectable by the sensor 112 without the need for a manual input or timer to start the recording. For example, a gun holster may include the sensor 112, and the recording device 200 may include a dashboard video recording device in a law enforcement vehicle that records video responsive to the gun being drawn from the gun holster. Accordingly, potentially legally relevant video footage of events following (and even leading up to) the drawing of the gun from the gun holster may be captured by the dashboard video recording device without the need for the law enforcement officer to constantly accrue video footage or take the time to manually start the recording during a crisis or emergency.
[00047] In some embodiments, the sensor signal 172 may itself be a trigger signal such that the recording device 200 starts recording responsive to receiving the sensor signal 172 (e.g., directly from the sensor 112 or through the hub 150). In such embodiments, the sensor signal 172 may not need to be processed by the recording device 200 or the hub 150.
[00048] In some embodiments, the object with sensor 110 may also include processing circuitry, similar to the hub 150. In such embodiments, processing of the sensor signal 172 may occur at the object with sensor 110 instead of, or in addition to, at the recording device 200 or the hub 150.
[00049] In some embodiments, the recording device 200 may also be configured to record information responsive to a manual input. Accordingly, a user may start recording even if no triggering event is detected automatically from data indicated by the sensor signal 172.
[00050] FIG. 2 is a simplified block diagram of an example of a recording device 200A of the information capture system 100 of FIG. 1. The recording device 200A may include one or more processing elements 230 (hereinafter "processing elements" 230) operably coupled to one or more communication elements 210, one or more data storage devices 240 (hereinafter "storage" 240), and at least one camera 220 (e.g., a video camera). The processing elements 230 may include processing elements similar to those discussed above with reference to the processing elements 154 of the communication hub 150 of FIG. 1. The processing elements 230 may also include hardware elements (e.g., application-specific integrated circuits, field-programmable gate arrays, etc.) configured to perform specialized functions related to image capture, image data storage, and sensor data analysis.
[00051] The storage 240 may include a video database 242 configured to store video data 212 captured by the camera 220. The storage 240 may also include computer-readable instructions 244 stored thereon. The computer-readable instructions 244 may be configured to instruct the processing elements 230 to perform functions of the recording device 200A. By way of non-limiting example, the computer-readable instructions 244 may be configured to instruct the processing elements 230 to control the camera 220 (e.g., activating, deactivating, focusing, adjusting a viewing angle, etc.) by transmitting control signals 232 to the camera 220. Also by way of non-limiting example, the computer-readable instructions 244 may be configured to instruct the processing elements 230 to communicate with at least one of the sensor 112 (FIG. 1) and the hub 150 (FIG. 1) through the communication elements 210. By way of non-limiting example, the computer-readable instructions 244 may be configured to instruct the processing elements 230 to respond to status requests 176 from the hub 150.
[00052] The communication elements 210 may be similar to the communication elements 114 of the object with sensor 110 and/or the communication elements 152 of the communication hub 150 of FIG. 1 (e.g., including wireless communication equipment, wired communication equipment, or combinations thereof). Accordingly, the recording device 200A may be configured to communicate with at least one of the object with sensor 110 (FIG. 1) and the hub 150 (FIG. 1) wirelessly and/or through wired electrical connections. Specifically, the processing elements 230 may be configured to receive at least one of a sensor signal 172, a trigger signal 174, and the status request 176 through the communication elements 210. The processing elements 230 may also be configured to transmit recording device status signals 178 (sometimes referred to herein simply as "R.D. status" 178) through the communication elements 210.
[00053] In operation, the processing elements 230 may receive one of the trigger signal 174 and the sensor signal 172 through the communication elements 210. If the trigger signal 174 is received, the processing elements 230 may transmit control signals 232 to the camera 220 instructing the camera 220 to capture video data 212 (or stop deleting pre-trigger buffer video data stored in the video database 242 if buffer video is being captured). If the sensor signal 172 is received, the processing elements 230 may, in some embodiments, process the sensor signal 172 to determine if a triggering event occurred. If the triggering event occurred, the processing elements 230 may instruct the camera 220 to capture the video data 212 (or stop deleting the pre-trigger buffer video data). The processing elements 230 may store the video data 212 captured by the camera 220 in the video database 242 of the storage 240. The processing elements 230 may continue storing video data 212 in the video database 242 until a determination is made to stop recording (e.g., responsive to a sensor, expiration of a predetermined time from the triggering event, etc.).
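The pre-trigger buffering behavior described above (continuously discard old frames until a trigger arrives, then stop deleting) can be sketched as follows. This is a minimal illustration with hypothetical names; it measures the buffer in frame counts rather than the time-based window in the text.

```python
from collections import deque

class PreTriggerBuffer:
    """Keep only the most recent frames until triggered; afterward, retain
    everything (the pre-trigger window plus all post-trigger frames)."""

    def __init__(self, max_pretrigger_frames):
        self.frames = deque()
        self.max_pretrigger = max_pretrigger_frames
        self.triggered = False

    def add_frame(self, frame):
        self.frames.append(frame)
        # Before the trigger, delete frames older than the buffer window.
        if not self.triggered:
            while len(self.frames) > self.max_pretrigger:
                self.frames.popleft()

    def trigger(self):
        # Corresponds to receiving trigger signal 174: stop deleting.
        self.triggered = True
```

For example, with a 3-frame window, frames 1-5 leave [3, 4, 5] in the buffer; after `trigger()`, frames 6 and 7 are appended without any deletion, preserving the footage leading up to the event.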
[00054] In some embodiments, the processing elements 230 may be configured to provide (through the communication elements 210) a video stream (e.g., to an electronic display or other electronic device) of the video data 212 stored in the video database 242. The video stream may include a real-time video stream or delayed video stream. In some embodiments, the processing elements 230 may be configured to share the video data 212 (or compressed versions thereof) stored in the video database 242 with a cloud storage server (not shown) remote from the recording device 200A.
[00055] FIG. 3 is a simplified flowchart of an example method 300 of information capture, according to one embodiment. The method may be performed by an information capture system, such as the information capture system 100 of FIG. 1. At operation 310, the method 300 may include analyzing sensor data from a sensor 112 (FIG. 1) secured to an object 110 that is configured to be involved with a possible event. In some embodiments, analyzing sensor data may include determining if a triggering event occurred. In some embodiments, determining if a triggering event occurred may include comparing the sensor data to a predetermined threshold, to a predetermined pattern, or to combinations thereof.
[00056] At operation 320, the method 300 may include pre-recording pre-trigger information with a recording device 200 (FIGS. 1 and 2) configured to record information about the possible event. In some embodiments, pre-recording pre-trigger information may include maintaining a predetermined amount of pre-trigger sensor information (e.g., video data) in a database (e.g., a video database). At operation 330, the method may include deleting a portion of the pre-trigger information that was recorded a predetermined amount of time before a present time.
[00057] At operation 340, the method 300 may include triggering the recording device 200 (FIGS. 1 and 2) to stop deleting the portion of the pre-trigger information responsive to determining, from the sensor data, that a triggering event occurred. In some embodiments, determining a triggering event occurred includes determining the triggering event occurred with at least one of the sensor 112 (FIG. 1), the recording device 200 (FIGS. 1 and 2), and the hub 150 (FIG. 1). In some embodiments, triggering a recording device to stop deleting the portion of the pre-trigger information includes transmitting one of a trigger signal 174 and a sensor signal 172 (FIG. 1) to the recording device 200.
[00058] At operation 350, the method 300 may include recording post-trigger information with the recording device 200 (FIGS. 1 and 2) following the determination that the triggering event occurred. In some embodiments, recording post-trigger information may include recording video data 212 with a different resolution than pre-trigger video data. In some embodiments, recording post-trigger information may include recording video data 212 with a higher resolution than pre-trigger video data.
[00059] At operation 360, in some embodiments, the method 300 may include stopping recording responsive to a triggering event detected by the sensor 112.
[00060] FIGS. 4A and 4B illustrate a specific, non-limiting example of an information capture system 100A, according to some embodiments.
[00061] FIG. 4A is a simplified view of a portion of the information capture system 100A on a law enforcement officer 400 (e.g., a police officer, a special agent, a military officer, etc.). The portion of the system 100A illustrated in FIG. 4A includes a body camera device 200B worn by the law enforcement officer 400, and various objects 110A, 110B, 110C, 110D, 110E, 110F, and 110G (a firearm 110A, a firearm holster 110B, a taser 110C, a taser holster 110D, a pepper spray can 110E, a pepper spray holster 110F, and handcuffs 110G) that may be used in association with actions of the law enforcement officer 400. FIG. 4A illustrates the firearm 110A, the firearm holster 110B, the taser 110C, the taser holster 110D, the pepper spray can 110E, the pepper spray holster 110F, and the handcuffs 110G secured to the law enforcement officer 400 by a belt 460. The portion of the system 100A of FIG. 4A also includes a smart watch 112J and a wearable patch 112K.
[00062] FIG. 4B is a simplified block diagram of the information capture system 100A. Although not illustrated in FIG. 4A, the information capture system 100A may include a dashboard camera device 200C instead of, or in addition to, the body camera device 200B. Other recording devices may also be included. The body camera device 200B, the dashboard camera device 200C, and any other camera device may be similar to the recording device 200A of FIG. 2. As also not illustrated in FIG. 4A, the information capture system 100A may include a computer 402 (e.g., for use in an emergency vehicle, using programs such as a Computer Aided Dispatch System, a Records Management System, etc.) and a geo-location sensor 112L. As further not illustrated in FIG. 4A, the information capture system 100A may include an emergency vehicle light 110H in addition to the objects 110A-G illustrated in FIG. 4A. The objects 110A-H may include sensors 112A-H, respectively, configured to detect potential actions of the law enforcement officer 400 involving the use of the objects 110A-110H.
[00063] By way of non-limiting example, the sensor 112A of the firearm 110A may be configured to detect when the law enforcement officer 400 is preparing to use the firearm 110A (e.g., the sensor 112A may be configured to detect a draw of the firearm 110A from the holster, a safety mechanism switching from an "on" position to an "off" position, and/or a firing of the firearm 110A). Also by way of non-limiting example, the sensor 112B of the firearm holster 110B may be configured to detect when the law enforcement officer 400 is preparing to use the firearm 110A (e.g., the sensor 112B may be configured to detect when the firearm 110A is withdrawn from the firearm holster 110B). As a specific, non-limiting example, the sensor 112B of the firearm holster 110B may include a mechanical button (a button that is normally open, normally closed, etc.,
that is undepressed when the firearm 110A is removed) or a flex sensor (i.e., a sensor that changes electrical properties such as resistance or capacitance responsive to a flex of the flex sensor) in the bottom of the firearm holster 110B that detects a withdrawal of the firearm 110A from the firearm holster 110B. As other specific, non-limiting examples, the sensor 112B may be attached to a side of the holster 110B (e.g., an inside side of the holster 110B), near the top of the holster 110B, or anywhere within or without the holster 110B. In some embodiments, multiple sensors 112B (e.g., of the same type, of different types, etc.) may be used on or in the holster 110B to ensure that withdrawal of the firearm 110A is properly detected. As a further, non-limiting example, the sensor 112B or sensors 112B may be configured to detect movement of holster safety systems that prevent undesired drawing of the firearm, such as a self-locking system or thumb-break snaps.
[00064] It should be noted that many different types of holsters exist for many different types of firearms, and the configuration and positioning of the sensor 112B or sensors 112B may vary for different types of holsters. Accordingly, sensors compatible with holsters and firearms that are commonly used by law enforcement officers may be used. Alternatively, the holsters themselves may include sensors in some embodiments.
[00065] The sensor 112B or sensors 112B should be durable, and may be inexpensive enough to be disposable. They should be small, yet have a battery life that is relatively long (e.g., years at a time). If not built into the holster 110B, the sensor 112B should be easily applied to the holster 110B. One approach is to include the sensor 112B in a guard at the bottom of the holster 110B. A guard (e.g., a plastic guard) that comes with the holster 110B may be replaced with the guard including the sensor 112B. The guard may include a mechanical button for the sensor 112B that may be pressed by a structural element of the firearm 110A (e.g., by a light attached below a barrel, etc.). Mechanical locks that hold the firearm 110A in place within the holster 110B may prevent false triggers.
[00066] Another approach is to clip the sensor 112B to the top rim of the holster 110B on the outside of the holster 110B. The sensor 112B may include a flex sensor or a spring steel lever inside the holster 110B that runs a distance (e.g., a few inches) down the wall of the holster 110B from the top rim. Other approaches may involve using a Hall effect sensor, a reed switch, a proximity sensor, a capacitive touch sensor, a pressure sensor, other sensors, or combinations thereof.
[00067] Combinations of sensors 112B may also be used on or in the holster 110B. For example, for looser holsters 110B (e.g., leather holsters), pressure sensors may be installed in multiple locations in the holster 110B in combination with a spring steel lever button at the top of the holster 110B. To avoid false positives, the cameras 200B, 200C may only be triggered if all of the pressure sensors detect no pressure and the spring steel lever button is decompressed. Accordingly, even if the firearm 110A moves around within the holster 110B, the cameras 200B, 200C will not trigger as long as one of the sensors 112B does not trigger.
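The AND-combination described above (trigger only when every pressure sensor reads no pressure and the lever button is released) reduces to a simple conjunction. The sketch below is illustrative; the zero-pressure reading and parameter names are assumptions.

```python
def should_trigger(pressure_readings, lever_released):
    """Trigger the cameras only when ALL pressure sensors detect no pressure
    AND the spring-steel lever button is released. A firearm merely shifting
    in the holster leaves at least one sensor pressed, so no false trigger."""
    no_pressure = all(p == 0 for p in pressure_readings)
    return no_pressure and lever_released
```

This all-sensors-agree design trades a slight delay in detection for robustness against the false positives the paragraph describes.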
[00068] As another example of a combination of sensors, a close range proximity sensor could be used in conjunction with a mechanical switch and/or flex sensor attached to the safety mechanism on the holster 110B that secures the firearm 110A. When the law enforcement officer 400 moves or unlatches the safety mechanism, the mechanical switch and/or flex sensor triggers the close range proximity sensor to power up. The proximity sensor may be used to ensure that the cameras 200B, 200C are activated only when the gun is drawn, but the proximity sensor may only function after the law enforcement officer 400 removes the safety mechanisms on the holster 110B. As a result, errors may be prevented, and battery power for the proximity sensor may be conserved. To further conserve power, the frequency and length of time signals from the proximity sensor are transmitted may be adjusted, as long as enough of the activation signal is present to transmit the message. The transmit range of the proximity sensor can also be adjusted to turn on cameras of other nearby law enforcement officers to record sensitive situations from as many angles as possible.
[00069] Similar to the sensors 112A and 112B detecting possible uses of the firearm 110A by the law enforcement officer 400, the sensors 112C-112H may be configured to detect possible uses of the objects 110C-110H corresponding thereto.
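The two-stage scheme above — the safety-mechanism switch gates power to the proximity sensor, which then reports the draw — can be sketched as a small state machine. All names here are hypothetical; this only illustrates the gating behavior, not any particular sensor hardware.

```python
class GatedProximitySensor:
    """Model of the combination: an unpowered proximity sensor cannot report
    a draw, which both conserves battery and suppresses spurious triggers."""

    def __init__(self):
        self.powered = False  # proximity sensor starts powered down

    def safety_unlatched(self):
        # The mechanical switch / flex sensor powers the proximity sensor up.
        self.powered = True

    def detect_draw(self, firearm_present):
        """Report a draw only when powered AND the firearm has left the holster."""
        if not self.powered:
            return False
        return not firearm_present
```

The key property is that `detect_draw` can never fire before `safety_unlatched`, mirroring the error prevention and power conservation described in the paragraph.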
[00070] The use of the objects 110A-110H by the law enforcement officer 400 may often accompany sensitive situations in which the law enforcement officer 400 may be, among other things, engaging in combat, exerting physical force, disabling a person, restraining a person, or signaling a motorist to pull over for a traffic stop. These sensitive situations sometimes escalate, resulting in unfortunate circumstances, and even human injury or death in some situations. Conduct of the law enforcement officer 400 or those the law enforcement officer 400 interacts with may sometimes later be investigated to determine whether some improper action was taken by either party. In order to aid in these investigations, the use of the objects 110A-110H may lead to a triggering of the body camera device 200B, the dashboard camera device 200C, or a combination thereof to record video images (e.g., including video images alone, or a combination of video images and audio). The recorded video images may later be studied during post-incident investigations.
[00071] The information capture system 100A includes circuitry 402 configured to trigger (e.g., to record, to stop recording, to stop deleting recordings taken a predetermined threshold period of time before being triggered, etc.) the body camera device 200B and the dashboard camera device 200C (and any other camera devices) responsive to detections by the sensors 112A-H that the law enforcement officer 400 may be about to use or stop using one of the objects 110A-110H. For example, the circuitry 402 may be configured to provide a trigger signal 174 to the body camera device 200B and/or the dashboard camera device 200C. In some embodiments, the trigger signal 174 may be configured to trigger the body camera device 200B and the dashboard camera device 200C to start or stop recording video images. In embodiments where there are multiple different camera devices, the camera devices may be capable of communicating with each other (e.g., wirelessly, via wired communications, etc.), and triggering each other to start or stop recording even if only one of the camera devices is triggered (e.g., by a triggering event, manually, or a combination thereof). As a specific, non-limiting example, the dashboard camera device 200C may be triggered to automatically start recording (and/or keep stored video from a predetermined buffer period of time before the dashboard camera device 200C is triggered) when the firearm 110A is drawn from the holster 110B, and to automatically stop recording when the firearm 110A is replaced into the holster 110B. The dashboard camera device 200C may also transmit signals to the body camera device 200B to start and stop recording.
[00072] In some embodiments, the trigger signal 174 may be configured to trigger the body camera device 200B and the dashboard camera device 200C to stop deleting video images that were recorded outside of a predetermined buffer period of time before the trigger so that events leading up to the use of the object 110A-110H may be recorded. In other words, the body camera device 200B and the dashboard camera device 200C may be configured to continuously record and store only a most recent portion of the video images corresponding to a predetermined length of time while deleting video images not of the most recent portion before the circuitry triggers the video recording device. Then, responsive to the trigger signal 174, the body camera device 200B and the dashboard camera device 200C may be configured to stop deleting the video images not of the most recent portion of the video images. In some embodiments, about thirty seconds of video may be maintained in the video recording device at a time before the trigger signal 174, resulting in thirty seconds of video leading up to the detected use of the object 110A-110H.
[00073] The circuitry 402 may also be configured to provide an identification (ID) signal 404 to the body camera device 200B and the dashboard camera device 200C. The ID signal 404 identifies which of the sensors 112A-112L and/or which of the objects 110A-110H triggered the trigger signal 174. The body camera device 200B and the dashboard camera device 200C may be configured to store information (e.g., in the storage devices 240) indicating the sensors 112A-112L and/or objects 110A-110H that triggered the trigger signal 174. Accordingly, a record of not only events following and leading up to the triggering event, but also of what object or sensor triggered the triggering event, may be recorded by the body camera device 200B and the dashboard camera device 200C.
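The record created from the ID signal 404 (which sensor and/or object caused the trigger, stored alongside the footage) can be sketched as a small data structure. The field names, identifier strings, and storage shape below are illustrative assumptions, not the disclosed format.

```python
from dataclasses import dataclass, field
import time

@dataclass
class TriggerRecord:
    """Pairs a trigger with the sensor/object that caused it, as conveyed
    by the ID signal 404. Identifier strings are hypothetical."""
    sensor_id: str   # e.g., "112B" (firearm holster sensor)
    object_id: str   # e.g., "110A" (firearm)
    timestamp: float = field(default_factory=time.time)

def store_trigger(storage, record):
    """Persist the record with the recording, modeled here as a list append."""
    storage.append(record)
    return record
```

Storing this alongside the video means a post-incident investigation can see not just what happened, but which object initiated the recording.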
[00074] In some embodiments, the circuitry 402 includes wireless communication circuitry. By way of non-limiting example, the circuitry 402 may include low-power, local area network (LAN) wireless communication circuitry (e.g., low power Bluetooth) communicating with the communication elements 210 (e.g., low power wireless communication circuitry) of the body camera device 200B and the dashboard camera device 200C. Although some well-known local area network communications employ a pairing function between devices, low power Bluetooth may operate without pairing (which may consume less electrical power than operating with pairing). Also, low power Bluetooth enables unidirectional communications (e.g., communication from the circuitry 402 to the body camera device 200B and the dashboard camera device 200C). In some embodiments, the circuitry 402 may be configured to communicate using low power Bluetooth, without undergoing pairing functions and only engaging in unidirectional communications. In this way, power savings may enable the use of a low capacity battery (e.g., a button battery) without requiring battery replacement for months or years at a time.
[00075] In some embodiments, a pairing function may be employed between the circuitry 402 and the body camera device 200B and the dashboard camera device 200C (e.g., using conventional Bluetooth). In some embodiments, the circuitry 402 may employ other wireless communications (e.g., WiFi communications, cellular wireless networks, Zigbee networks, etc.). In some embodiments, the circuitry 402 may employ wired communications. By way of non-limiting example, the belt 460 may serve as a wire harness interfacing the sensors 112A-H, the circuitry 402, and the body camera device 200B. In some embodiments, the circuitry 402 may employ both wired and wireless communications.
[00076] In some embodiments, each object 110A-110H may include its own circuitry 402. In some embodiments, the circuitry 402 may be separate from the objects 110A-110H. In some embodiments, the circuitry 402 may be incorporated into the objects 110A-110H.
[00077] In some embodiments, the information capture system 100A may only include one or some of the objects 110A-110H and/or sensors 112A-112L. By way of non-limiting example, the system 100A may only include the firearm 110A, the firearm holster 110B, and the firearm holster sensor 112B. In some embodiments, the information capture system 100A may include other objects and sensors instead of or in addition to the objects 110A-110H and sensors 112A-112H. By way of non-limiting example, the information capture system 100A may include body armor or a bulletproof vest equipped with a sensor or sensors, which would enable triggering of the body camera device 200B and the dashboard camera device 200C responsive to a detected impact (e.g., a gunshot or other blow to the body of the law enforcement officer 400). Also by way of non-limiting example, a bumper of a law enforcement vehicle may be equipped with a sensor to enable triggering of the body camera device 200B and the dashboard camera device 200C responsive to an impact (e.g., an impact with another vehicle or stationary object).
[00078] As a further non-limiting example, the geolocation sensor 112L may trigger the body camera device 200B when the law enforcement officer 400 enters a predetermined location (e.g., a prison cell, a crime scene, etc.). The geolocation sensor 112L may also trigger the body camera device 200B (e.g., to record, to start accumulating or stop deleting recorded data, to stop recording, to stop accumulating or start deleting recorded data, etc.) when the law enforcement officer 400 leaves a predetermined location. By way of non-limiting example, the trigger may be responsive to the law enforcement officer 400 entering or leaving a range of the geolocation sensor 112L, which may be secured to a wall, a ceiling, or other stationary or mobile object that is located in a location of interest. This could also be performed with a global positioning system (GPS) device (e.g., within the smart watch 112J, the wearable patch 112K, the computer 402, etc.). For example, a trigger may occur responsive to the law enforcement officer 400 entering or leaving a predetermined location or vicinity (e.g., a preset range) of the predetermined location.
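The GPS-based variant above reduces to a distance check against a preset range of the predetermined location. A standard haversine great-circle distance serves as an illustrative sketch; the coordinates, radius, and function name are assumptions.

```python
import math

def within_geofence(lat, lon, center_lat, center_lon, radius_m):
    """Return True when the officer's GPS position is inside the preset
    range (in meters) of the predetermined location, using the haversine
    great-circle distance on a spherical Earth."""
    r_earth = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat), math.radians(center_lat)
    dphi = math.radians(center_lat - lat)
    dlmb = math.radians(center_lon - lon)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    distance = 2 * r_earth * math.asin(math.sqrt(a))
    return distance <= radius_m
```

Triggering logic would then compare successive fixes: a transition from outside to inside (or inside to outside) the geofence starts or stops the recording.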
[00079] In some embodiments, the geolocation sensor 112L may function in conjunction with a motion detector (not shown). For example, the geolocation sensor 112L may function in a low-power mode to conserve power when the motion detector does not detect motion in the proximity of the geolocation sensor 112L.
[00080] Other sensors are contemplated herein (e.g., a sensor that generates a trigger responsive to the law enforcement vehicle exceeding a predetermined speed, a sensor built into a baton, a sensor built into a knife or a knife sheathe, etc.). For example, the smart watch 112J and/or the wearable patch 112K may include biometric sensors (e.g., heartrate sensors, accelerometers, gyrometers, etc.). As a specific, non-limiting example, if a heartrate of the law enforcement officer 400 elevates above a predetermined level, it may be determined that the law enforcement officer 400 is facing a sensitive situation (e.g., a situation requiring physical strain or evoking an emotional response that elevates the law enforcement officer's 400 heartrate, etc.). Also by way of non-limiting example, an accelerometer or gyrometer may be capable of sensing motions of or impacts to the law enforcement officer 400 that are likely to be associated with the law enforcement officer 400 sustaining an injury. These and other biometrically sensed events may trigger the body camera device 200B and/or the dashboard camera device 200C.
[00081] In some embodiments, any one or more of the sensors 112A-H may be configured to conserve power, while achieving an optimal or desired performance. To conserve power and reach optimal performance, the range, length, and frequency of the activation signals from the sensors 112A-H may be customized. By increasing the range of the activation signal, the sensor may be able to reach the recording devices of other officers in close proximity so that events of interest may be captured from multiple angles. By decreasing the range, greater power conservation can be achieved.
[00082] In some embodiments, the information capture system 100A may only include one of the body camera device 200B and the dashboard camera device 200C. In some embodiments, the information capture system 100A may include other recording devices 200 instead of or in addition to the body camera device 200B and the dashboard camera device 200C. By way of non-limiting example, the information capture system may include a recording device (e.g., video recording device, audio recording device, etc.) built into a pair of glasses, a helmet, a hat, or other wearable object.
[00083] In some embodiments, sensors 112 and cameras 200 of more than one law enforcement officer 400 may interact with each other to provide multiple triggers and/or recordings of multiple camera angles.
[00084] FIG. 5A is a simplified block diagram of a recording device 500 (e.g., that may be used for the information capture system of FIGS. 1, 4A, 4B, or combinations thereof), according to some embodiments. The recording device 500 includes an automatic image-tagging feature. The recording device 500 includes an image capture device 510 (e.g., a camera, a video camera, etc.) configured to generate image data 512 corresponding to images of at least a portion of surroundings of the image capture device 510. By way of non-limiting example, the image capture device 510 may utilize an H.264 or MPEG-4 standard for video coding when generating the image data 512. The images may be video images (e.g., a video segment of up to thirty (30) seconds of recorded video) and/or one or more still images, depending on various inputs of the recording device 500. The recording device 500 also includes a data storage device 520 configured to store a feature database 522 including feature data 524 indicating features that are desired to be identified in the images corresponding to the image data 512. By way of non-limiting example, the data storage device 520 may allow for 64 GB of storage.
[00085] The recording device 500 also includes a processor 530 that is operably coupled to the data storage device 520 and the image capture device 510. The processor 530 is configured to access data stored by the data storage device 520 (e.g., the image data 512, the feature data 524 of the feature database 522, the tagged image data 534, etc.). The processor 530 is also configured to detect one or more of the features indicated by the feature data 524 that are desired to be identified (e.g., cars, signs, license plates, buildings, human forms, colors, or human faces) in a captured image by comparing the feature data 524 to the captured image data 512. Once the features are identified, the processor 530 may produce tagged image data 534 correlating the identified features of the captured image to the features of the feature data 524. In some embodiments, the tagged image data 534 may include one or more metadata tags. The processor 530 may store the tagged image data 534 as well as the captured image data 512 to the data storage device 520.
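By way of non-limiting illustration, the compare-and-tag flow of paragraph [00085] may be sketched as follows. The feature labels and the detector stub are hypothetical placeholders; the disclosure does not mandate any particular recognition algorithm.

```python
# Sketch of the tag-and-store flow of paragraph [00085]; detect_features() is a
# hypothetical stand-in for whatever object-recognition method is used.
FEATURE_DATABASE = {"car", "sign", "license_plate", "building", "human_form", "human_face"}

def detect_features(image_data):
    """Stand-in detector: a real implementation would run object recognition
    over the pixel data; here we assume labels were already computed."""
    return image_data.get("labels", [])

def tag_image(image_data, feature_database=FEATURE_DATABASE):
    """Compare detected features against the feature database and emit
    tagged image data (metadata tags alongside the raw image)."""
    detected = detect_features(image_data)
    tags = sorted(set(detected) & feature_database)
    return {"image": image_data, "tags": tags}

storage = []  # stands in for the data storage device 520

tagged = tag_image({"pixels": b"...", "labels": ["car", "tree", "human_face"]})
storage.append(tagged)
print(tagged["tags"])  # -> ['car', 'human_face']
```

Note that both the tagged record and the underlying captured data reach the same store, mirroring the processor 530 writing both the tagged image data 534 and the image data 512 to the data storage device 520.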
[00086] In some embodiments, the processor 530 may be configured to compress the image data 512 and/or the tagged image data 534 before it is stored on the data storage device 520. By way of non-limiting example, the processor 530 may compress the image data 512 using context-adaptive binary arithmetic coding (CABAC), which provides context modeling.
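The compress-then-store ordering of paragraph [00086] may be illustrated as below. CABAC itself is an intricate entropy coder specified for H.264; the standard-library `zlib` codec is used here purely as a stand-in to show the ordering, not as the coding named in the text.

```python
import zlib

def compress_before_store(image_data: bytes) -> bytes:
    # zlib stands in for the CABAC entropy coding named in the text; the
    # point illustrated is only that compression precedes storage.
    return zlib.compress(image_data)

raw = b"frame" * 1000            # highly repetitive stand-in image data
stored = compress_before_store(raw)

# Compression is lossless: the original data is fully recoverable.
assert zlib.decompress(stored) == raw
print(len(stored) < len(raw))  # True for repetitive data
```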
[00087] FIG. 5B is a simplified block diagram of a recording device 500A, according to some embodiments. The recording device 500A is similar to the recording device 500 of FIG. 5A, including the data storage device 520, the feature database 522, and the processor 530. The recording device 500A, however, includes an audio capture device 514 in addition to, or instead of, the image capture device 510 of the recording device 500 of FIG. 5A. The audio capture device 514 generates audio data 516 in response to acoustic waves received by the audio capture device 514. Also, the feature database 522 includes feature data 524 indicating features that are desired to be identified in captured audio corresponding to the captured audio data 516 in addition to, or instead of, the feature data 524 corresponding to the features of the captured image discussed with reference to FIG. 5A.

[00088] Similarly as discussed above, the processor 530 is configured to detect one or more of the features indicated by the feature data 524 that are desired to be identified (e.g., speech, voices, or other specified sounds) in the captured audio by comparing the feature data 524 (e.g., voice recognition information or speech recognition information including a list of words of interest such as the word "gun") to the captured audio data 516. Once the features are identified, the processor 530 may produce tagged audio data 536 correlating the identified features of the captured audio to the features of the feature data 524. The tagged audio data 536 includes one or more tags. The processor 530 may provide the tagged audio data 536 as well as the captured audio data 516 to the data storage device 520. Also, in some embodiments, the processor 530 may compress the captured audio data 516 before storing the audio data 516 to the data storage device 520.
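The words-of-interest matching of paragraph [00088] may be sketched as follows, assuming speech recognition has already produced a transcript of the captured audio. The word list is illustrative except for "gun," which the text names.

```python
# Sketch of tagging captured audio against a list of words of interest,
# per paragraph [00088]. Transcription is assumed to have already occurred.
WORDS_OF_INTEREST = {"gun", "help", "fire"}  # illustrative; "gun" is from the text

def tag_audio(transcript: str, words_of_interest=WORDS_OF_INTEREST):
    """Return the words of interest found in a speech-recognition transcript;
    each match would become a tag in the tagged audio data 536."""
    words = {w.strip(".,!?").lower() for w in transcript.split()}
    return sorted(words & words_of_interest)

print(tag_audio("Drop the gun!"))  # -> ['gun']
```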
[00089] In embodiments where the recording device 500A is configured to tag features of both image data 512 and audio data 516, both image and audio features may be identified. In such embodiments, the processor 530 may be configured to tag the image data 512 and audio data 516 simultaneously. In some embodiments, the image data 512 and audio data 516 may be tagged asynchronously.
[00090] It is also contemplated herein that the processor 530 may be configured to tag data from any other sensor (e.g., the sensor signal 172 of FIGS. 1 and 2, signals from the sensors 112 of FIG. 4B, or combinations thereof) in addition to, or instead of, the image data 512 and/or the audio data 516. In such embodiments, the feature data 524 would indicate corresponding features of the signals from the sensor. Furthermore, it is contemplated herein that the recording devices 500, 500A may be combined with disclosed features of the recording devices 200 (FIG. 1), 200A (FIG. 2), or 200B (FIGS. 4A and 4B) discussed above.
[00091] FIG. 6 is a simplified front view of a recording device 500B (e.g., corresponding to the recording device 500 or 500A), according to some embodiments. The recording device 500B may be configured to record images, acoustic waves, or a combination thereof within the vicinity of the recording device 500B when the recording device 500B is powered on. In one embodiment, the recording device 500B may include a record button 670 configured to trigger an image capture device 610 (e.g., corresponding to the image capture device 510 of FIG. 5A) to capture image data and/or an audio capture device (not shown) (e.g., corresponding to the audio capture device 514 of FIG. 5B) to capture audio data. A processor (not shown) (e.g., the processor 530 of FIGS. 5A and 5B) in the recording device 500B may automatically tag certain audio and/or images that are recorded. By way of non-limiting example, the processor may tag images using pixelation of images and object recognition software. In one embodiment, the object recognition software may utilize feature data that includes biometric identification information (e.g., facial image recognition information, iris scanning information, and/or retina imaging information). In one embodiment, the processor may be configured to perform facial recognition of a human face identified in the image data. The tagging process may be able to distinguish among various colors, shapes, and sizes. In one embodiment, the feature data may include information for advanced analytics of non-human objects (e.g., license plate reading information).
[00092] In some embodiments, the recording device 500B may be powered on in response to stimuli (e.g., motion, noise, other triggers discussed herein, combinations thereof, etc.) and automatically activate the image capture device 610 and the audio capture device. In some embodiments, the recording device 500B or the image capture device 610 or audio capture device may only be activated manually with a switch mechanism (e.g., the record button 670). In some embodiments, the recording device 500B may include a communication device (e.g., the communication elements 210 of FIG. 2) that is configured to receive commands (e.g., a turn on command or turn off command, firmware updates, analytics such as face recognition on certain individuals, any other information to help law enforcement officers for better situational awareness and proactive policing, etc.) from a remote device (e.g., a remote server, a cloud server, the communication hubs 150 of FIG. 1, etc.). The processor may be configured to execute the received commands.
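The remote-command execution described in paragraph [00092] may be sketched as a simple dispatcher on the device. The command names and message format below are hypothetical; the disclosure names only the kinds of commands (turn on, turn off, firmware updates), not their encoding.

```python
# Sketch of a recording device executing commands received from a remote
# device, per paragraph [00092]. The command schema is illustrative only.
class RecordingDevice:
    def __init__(self):
        self.camera_on = False
        self.firmware = "1.0"

    def execute(self, command: dict):
        kind = command.get("type")
        if kind == "turn_on":
            self.camera_on = True
        elif kind == "turn_off":
            self.camera_on = False
        elif kind == "update_firmware":
            self.firmware = command["version"]
        else:
            raise ValueError(f"unknown command: {kind}")

device = RecordingDevice()
device.execute({"type": "turn_on"})
device.execute({"type": "update_firmware", "version": "1.1"})
print(device.camera_on, device.firmware)  # True 1.1
```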
[00093] The recording device 500B may include a housing 650 that is resistant to elements including dust particles, dirt, sand, and water. In one embodiment, the housing 650 may be such that the ingress of water in harmful quantity is unlikely when the recording device 500B is immersed in water under defined conditions (e.g., depth, pressure, or time). Similarly, the housing 650 may be such that any ingress of dust, dirt, and sand is unlikely to interfere with correct operation or device safety under defined conditions. The recording device 500B may include a data port, a Universal Serial Bus (USB) port, a memory card reader, another interface for transferring data, wireless communications equipment (e.g., WiFi, Cellular, Bluetooth, Zigbee, etc.), or combinations thereof.

[00094] FIG. 7 is a simplified illustration of a tagged information capture system 100B, according to some embodiments. The tagged information capture system 100B includes a recording device 500 (e.g., the recording devices 500, 500A, 500B, or combinations thereof) and one or more servers 740 (sometimes referred to herein as "server" 740). As discussed with reference to FIGS. 5A, 5B, and 6, the recording device 500 is configured to capture data (e.g., image data 512, audio data 516, sensor data, other data, or combinations thereof) and identify and tag features that are desired to be tagged in the captured data. The recording device 500 is also configured to generate and transmit tagged data 742 (e.g., including the tagged image data 534, the tagged audio data 536, tagged sensor data, other tagged data, or combinations thereof) to the server 740.
[00095] The server 740 (e.g., a cache server, a proxy server, a cloud server, or combinations thereof) is configured to receive and store the tagged data 742 in one or more data storage devices 746 (sometimes referred to herein as "storage" 746). In some embodiments, the server 740 may be configured to store the tagged image data into a searchable database 744. In some embodiments, the searchable database 744 may be organized based, at least in part, on search terms associated with the features desired to be tagged. By way of non-limiting example, if the recording device is used by a law enforcement officer, search terms associated with the features desired to be tagged may include "car," "sign," "license plate," "building," "person," names of colors, "human face," names of people of interest, "gun," other relevant search terms, or combinations thereof.
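The search-term organization of paragraph [00095] may be sketched as an inverted index mapping each search term to the tagged records that match it. The record identifiers and tags are illustrative placeholders.

```python
# Sketch of the searchable database 744 of paragraph [00095]: tagged records
# are indexed under the search terms associated with their tags.
from collections import defaultdict

class SearchableDatabase:
    def __init__(self):
        self._index = defaultdict(list)

    def store(self, record_id: str, tags: list):
        """Index a tagged record under each of its tags."""
        for tag in tags:
            self._index[tag].append(record_id)

    def search(self, term: str) -> list:
        """Return the identifiers of records tagged with the given term."""
        return self._index.get(term, [])

db = SearchableDatabase()
db.store("clip-001", ["car", "license plate"])
db.store("clip-002", ["gun", "car"])
print(db.search("car"))  # -> ['clip-001', 'clip-002']
```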
[00096] In some embodiments, the server 740 is configured to provide commands 750 to the recording device 500 (e.g., turn on camera, turn off camera, etc.). In some embodiments, the recording device 500 is configured to record video and store image data to a data storage device (e.g., the data storage device 520 of FIG. 5A) within the recording device 500. The recording device 500 is also configured to transmit tagged data 742 to the server 740.
[00097] In some embodiments, communications (e.g., the tagged data 742, the commands 750, etc.) between the recording device 500 and the server 740 are transmitted via an internet protocol network. By way of non-limiting example, the internet protocol network may include a wireless internet protocol network (e.g., a cellular data network). A cellular internet carrier may support the cellular data network and direct the tagged data 742 to the server 740. In some embodiments, a portion or all of the communications between the recording device 500 and the server 740 may be transmitted via one or more other wireless networks (e.g., a WiFi network, a Bluetooth network, a Zigbee network, etc.). In some embodiments, one or more wired networks (e.g., optical fiber networks, other cabled networks, or combinations thereof) may transmit a portion or all of the communications between the recording device 500 and the server 740.
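The transmission paths of paragraph [00097] may be illustrated with a serialization step and a toy network-selection rule. JSON is used as an illustrative wire format, and the preference order among wired, WiFi, and cellular links is an assumption for the sketch, not something the disclosure specifies.

```python
import json

def build_payload(tagged_data: dict) -> bytes:
    """Serialize tagged data 742 for transmission to the server 740 over an
    internet protocol network. JSON is an illustrative wire format only."""
    return json.dumps(tagged_data, sort_keys=True).encode("utf-8")

def choose_network(available: list) -> str:
    """Hypothetical preference order over the network types the text lists
    (wired, WiFi, cellular)."""
    for net in ("wired", "wifi", "cellular"):
        if net in available:
            return net
    raise RuntimeError("no network available")

payload = build_payload({"tags": ["gun"], "device": "500"})
print(choose_network(["cellular", "wifi"]))  # -> 'wifi'
```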
[00098] FIG. 8 is a simplified illustration of a tagged information capture system 100C (e.g., corresponding to the information capture systems 100, 100A, 100B, or combinations thereof), according to some embodiments. The tagged information capture system 100C includes one or more tagged information servers 740A (sometimes referred to herein as "server" 740A), the recording device 500 (as discussed with reference to FIG. 7), and a remote communication device 840. Similarly as previously discussed with reference to FIG. 7, the recording device 500 is configured to transmit tagged data 742 to the server 740A, and receive commands 750 from the server 740A.
[00099] The server 740A includes the storage 746 discussed above with reference to FIG. 7, and the searchable database 744 of the storage 746. The server 740A also includes one or more processors 810 (sometimes referred to herein as "processor" 810) and one or more non-transitory computer-readable storage media 820 (sometimes referred to herein as "storage media" 820). In some embodiments, the storage 746 and the storage media 820 may be integrated together in one or more devices, separately, or a combination thereof. The storage media 820 includes computer-readable instructions 822 stored thereon, which are configured to instruct the processor 810 to store the tagged data 742 on the data storage device 746. As previously discussed, the tagged data 742 correlates elements of captured data captured by the recording device 500 to features desired to be tagged. The computer-readable instructions 822 also instruct the processor 810 to distribute at least a tagged portion of the tagged data 742 to the remote communication device 840.
[000100] As previously discussed, the searchable database 744 is based (e.g., organized), at least in part, on search terms associated with the features that are desired to be tagged. The computer-readable instructions 822 may also be configured to instruct the processor 810 to distribute at least a tagged portion of the tagged data 742 to the remote communication device 840. In some embodiments, the remote communication device 840 may be configured to execute a database searching tool software program 842 configured to enable a user of the remote communication device 840 to perform a search corresponding to one of the search terms (e.g., key words of interest such as the words "brown truck," "gun," etc.). The remote communication device 840 may include a computing device 844 (e.g., a stand-alone control center, a computing device of a law enforcement officer, a computing device of a computer-aided law-enforcement dispatch system, etc.) for the tagged information capture system 100C. The remote communication device 840 may be configured to display (e.g., on an electronic display) video corresponding to at least a tagged portion of the tagged data 742.
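The multi-word search of paragraph [000100] (e.g., "brown truck") may be sketched as matching every query term against a record's tags. The stored records are hypothetical.

```python
# Sketch of the database searching tool software program 842: a query such
# as "brown truck" matches records carrying all of the query terms as tags.
RECORDS = [
    {"id": "clip-001", "tags": ["brown", "truck", "license plate"]},
    {"id": "clip-002", "tags": ["gun", "human face"]},
]

def search(query: str, records=RECORDS) -> list:
    """Return identifiers of records whose tags include every query term."""
    terms = query.lower().split()
    return [r["id"] for r in records if all(t in r["tags"] for t in terms)]

print(search("brown truck"))  # -> ['clip-001']
print(search("gun"))          # -> ['clip-002']
```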
[000101] In some embodiments, the server 740A, the remote communication device 840, or a combination thereof may be configured to remotely control the recording device 500, and/or perform analytics (e.g., face recognition analytics) on data (e.g., images, video, audio, etc.) captured by the recording device 500. The recording device 500 may be remotely controlled using the commands 750, as discussed above. In some embodiments, the commands 750 may include an on command configured to activate a camera (e.g., the image capture device 610 of FIG. 6) of the recording device 500, an off command configured to deactivate the camera, a command to update firmware of the recording device 500, or a combination thereof.
[000102] In some embodiments, the information capture system 100C includes a device of a social media service provider configured to publish at least a tagged portion of the tagged data 742. In some embodiments, however, one or more social media platforms 960A-D separate from the remote communication device 840 may be included (see FIG. 9).
[000103] In some embodiments, the computer-readable instructions 822 are configured to store a user profile for a user of the remote communication device 840 on the data storage device 746. Information of the user profile may be used to enable distribution of the tagged data 742 to certain authorized users. In some embodiments, the database searching tool software program 842 may be configured to prompt the user for login credentials, and provide the user access to all or a portion of the searchable database 744 responsive to receiving authentic login credentials from the user. Users that do not provide authentic login credentials may be denied access to the searchable database 744.

[000104] FIG. 9 is a simplified illustration of a tagged information distribution system 900, according to some embodiments. The tagged information distribution system 900 includes one or more tagged information servers 740B (sometimes referred to herein as "server" 740B), a remote communication device 840, and one or more social media platforms 960A-D. By way of non-limiting example, the social media platforms 960A-D may include server devices for different social media services (e.g., Instagram, Facebook, Twitter, Snapchat, other social media platforms, or combinations thereof). The server 740B is configured to publish at least a portion of the tagged data 742 to the social media platforms 960A-D. Although four different social media platforms 960A-D are illustrated in FIG. 9, it is contemplated herein that the system 900 may include any number of social media platforms 960, whether one social media platform, fewer than four, or more than four.
[000105] Similarly as previously discussed with reference to FIG. 8, the server 740B includes one or more processors 810, one or more non-transitory computer-readable storage media 820, and a data storage device 746. The storage media 820, however, includes computer-readable instructions 822A that are configured to instruct the processor 810 to distribute at least a tagged portion of the tagged data 742 to the social media platforms 960A-D. The computer-readable instructions 822A may also be configured to instruct the processor 810 to perform the operations of the computer-readable instructions 822 discussed above with reference to FIG. 8.
[000106] In some embodiments, the computer-readable instructions 822A may be configured to instruct the processor 810 to store a user profile for a user of the remote communication device 840 (e.g., on the data storage device 746). The user profile may include information identifying social media platforms 960A-D that the user corresponding to the user profile has authorized the server 740B to access. In some embodiments, the remote communication device 840 may be configured to prompt the user for social media credentials for accessing user profiles of the one or more social media platforms 960A-D. In some embodiments, the computer-readable instructions 822A are configured to instruct the processor 810 to detect incoming tagged data 742 from a recording device (not shown) (e.g., corresponding to the recording devices 500, 500A, 500B, or combinations thereof) and distribute at least a portion of the tagged data 742 to one or more host devices of the social media platforms 960A-D for publishing data (e.g., images, audio, other sensor data, etc.) corresponding to at least a portion of the tagged data 742.
[000107] As previously discussed, computer-readable instructions 822A stored on the storage media 820 may, according to some embodiments, be configured to instruct the processor 810 to distribute at least a tagged portion of tagged data 742 to the remote communication device 840. In some embodiments, the remote communication device 840 may include one or more application program interfaces through which a user may elect to publish tagged data 742 to the social media platforms 960A-D.
[000108] By way of non-limiting example, a law enforcement dispatcher or officer may be enabled to configure (e.g., using the recording device 500, the remote communication device 840, some other device, or combinations thereof) the server 740B to automatically publish tagged data 742 that corresponds to a particular search term of the searchable database 744. As a specific, non-limiting example, during a natural disaster, the server 740B may be configured to publish tagged data 742 that corresponds to the search term "shelters" or "dangerous roadways" to the social media platforms 960A-D. Those of the general public that are associated (e.g., "following," etc.) with the server 740B in some way may then have access to the tagged data 742.
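The auto-publish rule of paragraph [000108] may be sketched as a predicate that fires whenever incoming tagged data carries a term a dispatcher has configured. The configured terms ("shelters," "dangerous roadways") come from the text; the data shape is an assumption.

```python
# Sketch of paragraph [000108]: the server 740B auto-publishes tagged data
# whose tags match a dispatcher-configured search term.
def should_publish(tagged_data: dict, configured_terms: set) -> bool:
    """True when any tag on the incoming data matches a configured term."""
    return bool(set(tagged_data.get("tags", [])) & configured_terms)

terms = {"shelters", "dangerous roadways"}
print(should_publish({"tags": ["shelters", "flood"]}, terms))  # True
print(should_publish({"tags": ["parade"]}, terms))             # False
```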
[000109] In some embodiments, the remote communication device 840 may be configured to enter communication with the social media platforms 960A-D directly (e.g., through the internet) without intervention from the server 740B. By way of non-limiting example, the remote communication device 840 may be configured to execute software applications (sometimes known as "Apps") for the social media platforms 960A-D. In some such embodiments, the user may be enabled by the software applications to adjust certain settings (e.g., define permissions, identify what type of data to publish, delete previously published data, etc.) using the remote communication device 840. In some embodiments, communications may be made between the remote communication device 840 and the social media platforms 960A-D only through the server 740B, or both directly between the remote communication device 840 and the social media platforms 960A-D and through the server 740B. In some embodiments, the remote communication device 840 may not communicate with the social media platforms 960A-D, or only with some of the social media platforms 960A-D.

[000110] In some embodiments, the server 740B includes an application programming interface (API) stored thereon and configured to enable the server 740B to interface with one or more of the social media platforms 960A-D. In some embodiments, the server 740B includes an API for each of the social media platforms 960A-D.
Examples
[000111] The following is a non-exhaustive list of example embodiments that fall within the scope of the disclosure. In order to avoid complexity in providing the disclosure, not all of the examples listed below are separately and explicitly disclosed as having been contemplated herein as combinable with all of the others of the examples listed below and other embodiments disclosed hereinabove. Unless one of ordinary skill in the art would understand that these examples listed below, and the above disclosed embodiments, are not combinable, it is contemplated within the scope of the disclosure that such examples and embodiments are combinable.
[000112] Example 1 : A recording device, comprising: an image capture device configured to generate image data corresponding to images of at least a portion of surroundings of the image capture device; a data storage device configured to store a feature database including feature data indicating features that are desired to be identified in the images corresponding to the image data; and a processor operably coupled to the data storage device and the image capture device, the processor configured to: detect one or more of the features that are desired to be identified in a captured image corresponding to captured image data; store tag data indicating the tagged features of the captured image on the data storage device; and store the captured image data on the data storage device.
[000113] Example 2: The recording device of Example 1, wherein the captured image comprises a video image.
[000114] Example 3: The recording device of Example 2, wherein the video image includes a video segment of up to thirty (30) seconds of recorded video.
[000115] Example 4: The recording device according to any one of Examples 1-3, wherein the processor is further configured to compress the captured image data before storing the captured image data on the data storage device.

[000116] Example 5: The recording device according to any one of Examples 1-4, wherein the features indicated by the feature data stored in the data storage device include cars, signs, license plates, buildings, human forms, colors, or human faces.
[000117] Example 6: The recording device according to any one of Examples 1-5, further comprising an audio capture device configured to generate audio data responsive to acoustic waves received by the audio capture device, wherein: the feature data also indicates features that are desired to be identified in captured audio corresponding to captured audio data; and the processor is also configured to: detect one or more of the features that are desired to be identified in the captured audio corresponding to the captured audio data; store tag data indicating the tagged features of the captured audio on the data storage device; and store the captured audio data on the data storage device.
[000118] Example 7: The recording device of Example 6, wherein the feature data indicates voice recognition information.
[000119] Example 8: The recording device according to any one of Examples 6 and 7, wherein the feature data indicates speech recognition information.
[000120] Example 9: The recording device of Example 8, wherein the speech recognition information includes a list of words of interest, the list of words of interest including the word "gun."
[000121] Example 10: The recording device according to any one of Examples 1-9, further comprising a record button configured to trigger the image capture device to capture the image data.
[000122] Example 11: The recording device according to any one of Examples 1-10, wherein: the feature data also indicates facial recognition information; and the processor is configured to perform facial recognition of a human face identified in the image data.
[000123] Example 12: The recording device according to any one of Examples 1-11, further comprising a communication device configured to receive commands from a remote device, wherein the processor is configured to execute the commands.
[000124] Example 13: The recording device of Example 12, wherein the commands received from the remote device include a turn on camera command and a turn off camera command.
[000125] Example 14: A tagged information capture system, comprising: a recording device configured to: capture image data; identify and tag features that are desired to be tagged in the captured image data; and transmit the tagged image data; and one or more servers configured to receive and store the tagged image data transmitted from the recording device.
[000126] Example 15: The tagged information capture system of Example 14, wherein the image data includes video image data.
[000127] Example 16: The tagged information capture system according to any one of Examples 14 and 15, wherein the one or more servers include a cloud server.
[000128] Example 17: The tagged information capture system according to any one of Examples 14-16, wherein the recording device is configured to transmit the tagged image data to the one or more servers via a wireless internet protocol network.
[000129] Example 18: The tagged information capture system of Example 17, wherein the wireless internet protocol network includes a cellular data network.
[000130] Example 19: The tagged information capture system of Example 18, wherein a cellular internet carrier supporting the cellular data network is configured to direct the tagged image data to the one or more servers.
[000131] Example 20: The tagged information capture system according to any one of Examples 14-19, wherein the recording device is further configured to: capture audio data; identify and tag features that are desired to be tagged in the captured audio data; and transmit the tagged audio data to the one or more servers.
[000132] Example 21 : The tagged information capture system of Example 20, wherein the one or more servers are configured to receive and store the tagged audio data.
[000133] Example 22: The tagged information capture system according to any one of Examples 14-21, wherein the one or more servers are configured to store the tagged image data into a searchable database organized based, at least in part, on search terms associated with the features that are desired to be tagged.
[000134] Example 23: One or more tagged information servers, comprising: one or more processors; and one or more non-transitory computer-readable storage media including computer-readable instructions stored thereon, the computer-readable instructions configured to instruct the one or more processors to: store tagged image data on a data storage device, the tagged image data including image data captured by a recording device and tag data correlating elements of an image corresponding to the captured image data to features that are desired to be tagged; and distribute at least a tagged portion of the tagged image data to a remote communication device.
[000135] Example 24: The one or more tagged information servers of Example 23, wherein the tagged image data includes tagged video data.

[000136] Example 25: The one or more tagged information servers of Example 24, wherein the tagged video data includes a tagged video image portion and a tagged audio portion.

[000137] Example 26: The one or more tagged information servers according to any one of Examples 23 and 24, wherein the computer-readable instructions are configured to instruct the one or more processors to store the tagged image data into a searchable database stored on the data storage device, the searchable database organized based, at least in part, on search terms associated with the features that are desired to be tagged.

[000138] Example 27: The one or more tagged information servers of Example 26, wherein the computer-readable instructions are configured to instruct the one or more processors to distribute the at least a tagged portion of the tagged image data to the remote communication device responsive to the remote communication device executing a database searching tool software program during which a user of the remote communication device performs a search for one of the search terms using the database searching tool software program.

[000139] Example 28: The one or more tagged information servers according to any one of Examples 23-27, wherein the remote communication device includes a device of a social media service provider configured to publish the at least a tagged portion of the tagged image data.

[000140] Example 29: The one or more tagged information servers according to any one of Examples 23-28, wherein the remote communication device includes a computing device of a stand-alone control center for the tagged information capture system.

[000141] Example 30: The one or more tagged information servers according to any one of Examples 23-29, wherein the remote communication device includes a computing device of a computer-aided law-enforcement dispatch system.
[000142] Example 31: The one or more tagged information servers according to any one of Examples 23-30, wherein the remote communication device includes a computing device of a law enforcement officer.

[000143] Example 32: The one or more tagged information servers according to any one of Examples 23-31, wherein the remote communication device is configured to display video corresponding to the at least a tagged portion of the tagged image data.

[000144] Example 33: The one or more tagged information servers according to any one of Examples 23-32, wherein the one or more servers, the remote communication device, or both the one or more servers and the remote communication device are configured to remotely control the recording device.

[000145] Example 34: The one or more tagged information servers according to any one of Examples 23-33, wherein commands for remotely controlling the recording device include one or more commands selected from the group consisting of: an on command configured to activate a camera of the recording device; an off command configured to deactivate the camera of the recording device; and a firmware update command configured to update firmware of the recording device.

[000146] Example 35: The one or more tagged information servers according to any one of Examples 23-34, wherein the one or more servers, the remote communication device, or both the one or more servers and the remote communication device are configured to perform analytics on the image data captured by the recording device.

[000147] Example 36: The one or more tagged information servers of Example 35, wherein the analytics include face recognition analytics.

[000148] Example 37: The one or more tagged information servers according to any one of Examples 23-36, wherein the computer-readable instructions are configured to instruct the one or more processors to store a user profile for a user of the remote communication device on the data storage device.

[000149] Example 38: The one or more tagged information servers of Example 37, wherein the computer-readable instructions are configured to instruct the one or more processors to prompt the user for social media credentials for one or more social media platforms.

[000150] Example 39: The one or more tagged information servers of Example 38, wherein the computer-readable instructions are configured to instruct the one or more processors to transmit the at least a tagged portion of the tagged image data to one or more host devices of the one or more social media platforms for publishing images corresponding to the at least a tagged portion of the tagged image data.

[000151] Example 40: The one or more tagged information servers according to any one of Examples 38 and 39, wherein the computer-readable instructions are configured to instruct the one or more processors to: detect incoming tagged image data from the recording device; and distribute the incoming tagged image data to each of the one or more host devices of the one or more social media platforms for publishing.
[000152] While certain illustrative embodiments have been described in connection with the figures, those of ordinary skill in the art will recognize and appreciate that embodiments encompassed by the disclosure are not limited to those embodiments explicitly shown and described herein. Rather, many additions, deletions, and modifications to the embodiments described herein may be made without departing from the scope of embodiments encompassed by the disclosure, such as those hereinafter claimed, including legal equivalents. In addition, features from one disclosed embodiment may be combined with features of another disclosed embodiment while still being encompassed within the scope of embodiments encompassed by the disclosure, as contemplated by the inventors.
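For illustration only (this sketch is not part of the disclosure or claims, and all names in it are hypothetical), the server-side flow described in the examples above — storing tagged image data in a searchable database organized by search terms, then distributing matching tagged portions — might look like this in outline:

```python
# Illustrative sketch of a tagged information server: store tagged image
# data, index it by tag (search term), and retrieve matching items.
# All class and attribute names are hypothetical.
from dataclasses import dataclass


@dataclass
class TaggedImageData:
    image_id: str
    image_bytes: bytes
    tags: list  # features identified in the image, e.g. ["license_plate"]


class TaggedInformationServer:
    def __init__(self):
        self._images = {}  # image_id -> TaggedImageData
        self._index = {}   # tag (search term) -> set of image_ids

    def store(self, item: TaggedImageData):
        """Store tagged image data and index it by its tags."""
        self._images[item.image_id] = item
        for tag in item.tags:
            self._index.setdefault(tag, set()).add(item.image_id)

    def search(self, term: str):
        """Return the tagged image data whose tags match a search term."""
        return [self._images[i] for i in sorted(self._index.get(term, set()))]


server = TaggedInformationServer()
server.store(TaggedImageData("img-1", b"...", ["license_plate", "car"]))
server.store(TaggedImageData("img-2", b"...", ["human_face"]))
print([x.image_id for x in server.search("license_plate")])  # ['img-1']
```

A remote communication device running a database searching tool would, in this sketch, simply call `search` with a user-supplied term and receive the tagged portions for display or further distribution.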

Claims
1. One or more tagged information servers, comprising:
one or more processors; and
one or more non-transitory computer-readable storage media including computer-readable instructions stored thereon, the computer-readable instructions configured to instruct the one or more processors to:
store tagged image data on a data storage device, the tagged image data including image data captured by a recording device and tag data correlating elements of an image corresponding to the captured image data to features that are desired to be tagged; and
distribute at least a tagged portion of the tagged image data to a remote communication device.
2. The one or more tagged information servers of claim 1, wherein the computer-readable instructions are configured to instruct the one or more processors to store the tagged image data into a searchable database stored on the data storage device, the searchable database organized based, at least in part, on search terms associated with the features that are desired to be tagged.
3. The one or more tagged information servers of claim 2, wherein the computer-readable instructions are configured to instruct the one or more processors to distribute the at least a tagged portion of the tagged image data to the remote communication device responsive to the remote communication device executing a database searching tool software program during which a user of the remote communication device performs a search for one of the search terms using the database searching tool software program.
4. The one or more tagged information servers of claim 1, wherein the remote communication device includes a computing device of a stand-alone control center for a tagged information capture system.
5. The one or more tagged information servers of claim 1, wherein the remote communication device includes a computing device of a computer-aided law-enforcement dispatch system.
6. The one or more tagged information servers of claim 1, wherein the remote communication device includes a computing device of a law enforcement officer.
7. The one or more tagged information servers of claim 1, wherein the remote communication device is configured to display video corresponding to the at least a tagged portion of the tagged image data.
8. The one or more tagged information servers of claim 1, wherein the one or more tagged information servers, the remote communication device, or both the one or more tagged information servers and the remote communication device are configured to remotely control the recording device.
9. The one or more tagged information servers of claim 8, wherein commands for remotely controlling the recording device include one or more commands selected from the group consisting of:
an on command configured to activate a camera of the recording device;
an off command configured to deactivate the camera of the recording device; and
an update command configured to update firmware of the recording device.
10. The one or more tagged information servers of claim 1, wherein the one or more tagged information servers, the remote communication device, or both the one or more tagged information servers and the remote communication device are configured to perform analytics on the image data captured by the recording device.
11. The one or more tagged information servers of claim 10, wherein the analytics include face recognition analytics.
12. The one or more tagged information servers of claim 1, wherein the computer-readable instructions are configured to instruct the one or more processors to transmit the at least a tagged portion of the tagged image data to one or more host devices of one or more social media platforms for publishing images corresponding to the at least a tagged portion of the tagged image data.
13. The one or more tagged information servers of claim 12, wherein the computer-readable instructions are configured to instruct the one or more processors to:
detect incoming tagged image data from the recording device; and
distribute the incoming tagged image data to each of the one or more host devices of the one or more social media platforms for publishing.
14. A tagged information capture system, comprising:
a recording device configured to:
capture image data;
identify and tag features that are desired to be tagged in the captured image data; and
transmit the tagged image data; and
one or more servers configured to receive and store the tagged image data transmitted from the recording device.
15. The tagged information capture system of claim 14, wherein the recording device is further configured to:
capture audio data;
identify and tag features that are desired to be tagged in the captured audio data; and
transmit the tagged audio data to the one or more servers.
16. A recording device, comprising:
an image capture device configured to generate image data corresponding to images of at least a portion of surroundings of the image capture device;
a data storage device configured to store a feature database including feature data indicating tagged features that are desired to be identified in the images corresponding to the image data; and
a processor operably coupled to the data storage device and the image capture device, the processor configured to:
detect one or more of the tagged features that are desired to be identified in a captured image corresponding to captured image data;
store tag data indicating the tagged features of the captured image on the data storage device; and
store the captured image data on the data storage device.
17. The recording device of claim 16, wherein the tagged features indicated by the feature data stored in the data storage device include cars, signs, license plates, buildings, human forms, colors, or human faces.
18. The recording device of claim 16, further comprising an audio capture device configured to generate audio data responsive to acoustic waves received by the audio capture device, wherein:
the feature data also indicates tagged features that are desired to be identified in captured audio corresponding to captured audio data; and
the processor is also configured to:
detect one or more of the tagged features that are desired to be identified in the captured audio corresponding to the captured audio data;
store tag data indicating the tagged features of the captured audio on the data storage device; and
store the captured audio data on the data storage device.
19. The recording device of claim 18, wherein the feature data indicates at least one of voice recognition information and speech recognition information.
20. The recording device of claim 16, wherein:
the feature data also indicates facial recognition information; and
the processor is configured to perform facial recognition of a human face identified in the image data.
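For illustration only (not part of the claims; all names are hypothetical), the recording-device loop recited in claims 16-20 — detect tagged features in a captured frame against a feature database, then store the tag data alongside the captured image data — might be sketched as:

```python
# Hypothetical sketch of the recording-device flow of claims 16-20:
# detect features of interest in a captured frame, then store tag data
# and the captured image data together.

# Feature database indicating the tagged features desired to be identified
# (cf. claim 17).
FEATURE_DATABASE = {"car", "sign", "license_plate", "building",
                    "human_form", "human_face"}


def detect_features(frame: dict) -> set:
    # Placeholder detector: a real device would run a trained vision model;
    # here the frame carries precomputed labels for illustration.
    return set(frame.get("labels", [])) & FEATURE_DATABASE


def process_frame(frame: dict, storage: list) -> set:
    """Detect tagged features, then store tag data with the image data."""
    tags = detect_features(frame)
    storage.append({"image": frame["pixels"], "tags": sorted(tags)})
    return tags


storage = []
frame = {"pixels": b"\x00\x01", "labels": ["car", "tree", "license_plate"]}
print(sorted(process_frame(frame, storage)))  # ['car', 'license_plate']
```

The same detect-tag-store pattern extends to captured audio (claims 18-19) by swapping the placeholder detector for voice or speech recognition over audio samples.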
PCT/US2018/043771 2017-07-26 2018-07-25 Systems and method for information capture and transmission WO2019023389A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201762537392P 2017-07-26 2017-07-26
US62/537,392 2017-07-26
US201762607228P 2017-12-18 2017-12-18
US201762607226P 2017-12-18 2017-12-18
US62/607,226 2017-12-18
US62/607,228 2017-12-18

Publications (1)

Publication Number Publication Date
WO2019023389A1 (en) 2019-01-31

Family

ID=65040958

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/043771 WO2019023389A1 (en) 2017-07-26 2018-07-25 Systems and method for information capture and transmission

Country Status (1)

Country Link
WO (1) WO2019023389A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150261789A1 (en) * 2012-03-26 2015-09-17 Amazon Technologies, Inc. Cloud-based photo management
US20160173833A1 (en) * 2008-01-24 2016-06-16 Micropower Technologies, Inc. Video Delivery Systems Using Wireless Cameras



Legal Events

121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18839354; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

32PN Ep: public notification in the EP bulletin, as the address of the addressee cannot be established (Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 24/06/2020))

122 Ep: PCT application non-entry into the European phase (Ref document number: 18839354; Country of ref document: EP; Kind code of ref document: A1)