US12363261B2 - In-band video communication - Google Patents
In-band video communication
- Publication number
- US12363261B2 (U.S. Application No. 18/313,980)
- Authority
- US
- United States
- Prior art keywords
- metadata
- video
- frames
- sensor device
- performance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/238—Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
- H04N21/2389—Multiplex stream processing, e.g. multiplex stream encrypting
- H04N21/23892—Multiplex stream processing, e.g. multiplex stream encrypting involving embedding information at multiplex stream level, e.g. embedding a watermark at packet level
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44008—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/01—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
- H04N7/0117—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving conversion of the spatial resolution of the incoming video signal
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/10—Recognition assisted with metadata
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
Definitions
- the present disclosure relates generally to a video surveillance management system, and more specifically, to in-band video communication.
- the present disclosure includes a method for video management within a CCTV system.
- the method includes receiving, at a computer device via one or more intermediate devices in the CCTV system, a video stream generated by a sensor device of the CCTV system.
- the video stream includes a plurality of video frames.
- the computer device sends, via the one or more intermediate devices of the CCTV system, an instruction to a sensor device configured to generate a video stream including a plurality of video frames.
- the computer device receives, via the one or more intermediate devices of the CCTV system, one or more frames of the plurality of video frames embedded with metadata associated with performance of the instruction by the sensor device. Performance of the CCTV system is evaluated using the metadata embedded within the one or more video frames.
- Additional aspects may include an apparatus including a memory and a processor coupled with the memory and configured to perform either of the above-noted methods. Further aspects may include a computer-readable medium storing instructions executable by a processor to perform either of the above-noted methods.
- FIG. 1 is a block diagram of a system for video management within CCTV systems
- FIG. 2 is a block diagram illustrating an example of video traversing through a number of intermediate devices before being displayed to a user of the CCTV system;
- FIG. 4 is an example of metadata transmitted within a video stream
- FIG. 6 illustrates an example of a general-purpose computer system
- FIG. 7A is a block diagram illustrating an example of video traversing in a CCTV system
- FIG. 7D is a block diagram illustrating an example of video traversing in a CCTV system including compatible and incompatible intermediate devices
- FIG. 8 is a flowchart of an example method for video management by a video destination.
- the inserted metadata may be used to validate video transmission through a series of devices despite intermediate equipment being unaware of the additional metadata within the transmitted stream.
- the metadata may also aid performance measurements, such as round-trip latency within a CCTV system.
- FIG. 1 is a block diagram illustrating an example system 100 for video management within CCTV systems.
- system 100 illustrates CCTV systems 102 (including systems 102 A through 102 C), CCTV application server 116 , sensors (e.g., video cameras) 110 (including sensors 110 A through 110 G), monitoring devices 124 (including devices 124 A and 124 B), and networks including, for example, Local Area Network (LAN) 108 , and Internet Protocol (IP) Network 104 .
- One or more of the devices or systems shown may be implemented on one or more computing devices using hardware, software, or a combination thereof.
- Such a computing device is shown in FIG. 6 and may include, but is not limited to, a device having a processor and memory, including a non-transitory memory, for executing and storing instructions.
- the memory may tangibly embody the data and program instructions.
- Software may include one or more applications and an operating system.
- Hardware may include, but is not limited to, a processor, memory, and graphical user interface display.
- the computing device may also have multiple processors and multiple shared or separate memory components.
- the computing device may be a part of or the entirety of a clustered computing environment.
- CCTV system 102 A-C may each include a variety of network interfaces (not shown) including wireless and wired network interfaces to support a variety of network protocols.
- Each CCTV system 102 may implement and host one or more CCTV applications for management of one or more video cameras 110 across one or more networks.
- CCTV system 102 A may manage or monitor video cameras 110 E-G through LAN 108 and video cameras 110 A-D through Internet Protocol (IP) network 104 .
- Video cameras 110 may be mounted surveillance video cameras that are analog cameras or IP cameras, which may be powered over Ethernet cables (PoE). Such surveillance cameras may have pan, tilt, and zoom (PTZ) capabilities.
- video camera 110 may capture a video feed and may insert metadata associated with an operating status of the video camera to be transmitted, via a network, to one or more of CCTV systems 102 managing video camera 110 .
- the metadata may include at least one of the following information related to video camera 110 : unique video camera identifier (ID), group video camera ID, one or more associated user IDs, geolocation information, timestamp of captured video, direction, inclination, angle, moment, velocity, acceleration, or other environmental parameters.
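The camera-status fields listed above could be modeled as a simple record. The sketch below is purely illustrative: the field names and the JSON serialization are assumptions, since the patent does not prescribe a wire format.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class CameraMetadata:
    # Hypothetical field names for the metadata items listed above
    camera_id: str          # unique video camera identifier (ID)
    group_id: str           # group video camera ID
    user_ids: tuple         # one or more associated user IDs
    latitude: float         # geolocation information
    longitude: float
    timestamp_ms: int       # timestamp of captured video
    pan_deg: float          # direction
    tilt_deg: float         # inclination

    def to_json(self) -> str:
        """Serialize to a compact JSON string, e.g., for QR encoding."""
        return json.dumps(asdict(self), separators=(",", ":"))

meta = CameraMetadata("cam-110A", "grp-1", ("user-7",), 51.5, -0.12,
                      1700000000000, 45.0, -10.0)
payload = meta.to_json()
```

A compact, frame-sized payload like this is what a camera 110 could embed into each outgoing frame.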
- video cameras 110 A-D may be representative of surveillance cameras mounted on the exterior of and/or within office buildings, near bus stops, at traffic intersections, etc.
- Video cameras 110 A-B may be managed through CCTV server 116 having connection to IP network 104 .
- video cameras such as video cameras 110 D
- video cameras, such as video camera 110 C may include networking hardware to connect to IP network 104 directly.
- one or more video cameras 110 such as video camera 110 C, may be wearable video cameras.
- NVR 117 may be one or more video storage devices having application software for storing, searching, and retrieving stored video files.
- NVR 117 may store video files in association with metadata tags, keywords, and other associated information files that are relationally associated with the video file and/or which are embedded as metadata within the video format and stored file.
- NVR 117 may enable an operator accessing the CCTV application to not only view a live video feed but also retrieve stored video feeds of video cameras 110 based on one or more search criteria.
- an operator may input a search criterion within the CCTV application managed by CCTV application server 116 .
- NVR 117 may receive the search criterion and query the stored video files based on the search criterion to locate one or more video files associated with the search criterion.
- NVR 117 may be representative of a system of devices, each of which may be implemented on one or more processors.
- FIG. 2 is a block diagram illustrating an example of video traversing through a number of intermediate devices before being displayed to a user of the CCTV system.
- a camera 110 is the source of the video stream, which is received by a recorder 117 (such as NVR) for initial, short term storage.
- NVRs 117 are typically built and deployed with memories of limited size in order to contain cost, although mass storage and mass storage management, for example a computer-type hard drive, may be included in such devices.
- a ‘lightweight’ memory solution is envisioned in an aspect, in which an abbreviated memory or memory buffer is used to temporarily store video streams collected during a predetermined time of service, for example a day. Periodically, or upon a user's request, NVR 117 may transfer the collected video streams to a different memory for management and analysis, such as an archive 115 .
- a transcoder 202 may alter the video stream by decoding (for example, from an H.264 format), altering (e.g., reducing the resolution) and encoding once more (for example, back to the H.264 format).
- the ability to perform transcoding on the fly may simplify storage and management involved with supporting adaptive streaming. It will be appreciated that by generating requested video stream portions on the fly, the amount of storage required to implement adaptive streaming may be reduced significantly.
- monitoring devices 124 often may need to retrieve previously stored video data from a data archive library systems 115 that may be connected to the CCTV application server 116 and/or NVR 117 for data archiving.
- the data archiving solutions 115 may employ one or more tape library systems.
- the desired data may be identified based on information other than the data itself, before the entire set of video streams is retrieved from the tape library to the user.
- Such a technique for screening of data improves the overall performance and usability of the archive system 115 .
- video streams fetched from the archive system 115 may be viewed by users using a computing device having a video display 204 .
- it may be useful for devices 110 - 117 and 202 - 204 to communicate with each other. It should be noted that at least some of the devices 110 - 117 and 202 - 204 may be from different vendors and could be incompatible with each other.
- the inter-device communication is important because each of the aforementioned devices 110 - 117 and 202 - 204 serves a particular function.
- the computing device 204 may, for example, need to verify that: 1) every video frame is present in the received video stream; 2) every video frame is received in the correct order; 3) every video frame is received from the correct camera; and 4) every video frame is received with the correct timing data.
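The four checks above can be sketched over per-frame metadata records. This is a minimal illustration, assuming each frame carries a camera ID, a monotonic sequence number, and a capture timestamp (hypothetical key names, not from the patent):

```python
def verify_stream(frames, expected_camera, max_gap_ms):
    """Check presence, order, source, and timing of received frames.

    `frames` is a list of dicts with assumed keys 'camera_id',
    'seq' (monotonic counter), and 'ts_ms' (capture timestamp).
    Returns a list of human-readable problems (empty if clean).
    """
    problems = []
    for prev, cur in zip(frames, frames[1:]):
        if cur["seq"] > prev["seq"] + 1:
            problems.append(
                f"missing frame(s) between seq {prev['seq']} and {cur['seq']}")
        elif cur["seq"] <= prev["seq"]:
            problems.append(
                f"out-of-order frame seq {cur['seq']} after {prev['seq']}")
        if cur["ts_ms"] - prev["ts_ms"] > max_gap_ms:
            problems.append(f"timing gap before seq {cur['seq']}")
    for f in frames:
        if f["camera_id"] != expected_camera:
            problems.append(
                f"frame seq {f['seq']} from wrong camera {f['camera_id']}")
    return problems
```

In practice the per-frame fields would come from the decoded in-band metadata rather than from plain dicts.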
- the CCTV application server 116 may be configured to provide a GUI to perform contiguous video camera alignment. This functionality may be provided via additional user interface elements included in the above-mentioned GUI for specifying lane/path boundaries.
- Contiguous video camera alignment refers to the adjustment and/or calibration, during the initial camera hardware installation process or a later hardware update, of the pan, tilt, and/or zoom settings of a PTZ-equipped camera 110 to ensure there is sufficient overlap in the fields of view between cameras.
- the CCTV application server 116 can review the received metadata to determine whether a handoff zone of sufficient size and/or duration is provided by a pair of cameras.
- pan, tilt, and/or zoom settings may require changes, or the physical installation location of a camera may need to be changed altogether.
- the metadata within the analyzed video stream could contain acknowledgements of PTZ instructions, allowing the computing device 204 receiving the video stream to measure PTZ round trip latency, for example.
- a video stream comprising a plurality of video frames 302 is transmitted from the video camera 110 to the computing device 204 via the recorder device 117 .
- the plurality of video frames 302 include frame #1 through frame #23.
- the cross-hatching or lack thereof in the box representing each frame represents frames captured before the instruction 302 is received by the camera 110 (e.g., frame #1 through frame #14), a frame in which the instruction 302 is received and processed (e.g., frame #15), and frames captured after the instruction 302 is executed by the camera 110 (e.g., frame #16 through frame #23).
- the recorder device 117 in this example comprises an intermediate device.
- latency measurements can also be useful during deployment of CCTV equipment.
- factors such as network topology, equipment selection and protocol choice may influence the overall performance of the system.
- system performance may be improved by diagnosing existing issues and by facilitating system tuning.
- the play-out of one media stream may constitute a reference for the play-out of another media stream, e.g., an audio stream, or vice versa, e.g., to achieve lip synchronization.
- Metadata in both audio and video channels can be used to measure this lip synchronization.
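With per-frame timestamps embedded in both channels, the audio/video offset at play-out reduces to a timestamp comparison. Field names and the acceptability limits below are illustrative assumptions, not values from the patent:

```python
def av_offset_ms(video_meta: dict, audio_meta: dict) -> int:
    """Positive result: audio lags video; negative: audio leads.
    Both dicts carry an assumed 'capture_ts_ms' metadata field."""
    return audio_meta["capture_ts_ms"] - video_meta["capture_ts_ms"]

def lip_sync_ok(offset_ms: int,
                lead_limit_ms: int = 45,
                lag_limit_ms: int = 125) -> bool:
    """Check the offset against an acceptability window
    (limits here are placeholder assumptions)."""
    return -lead_limit_ms <= offset_ms <= lag_limit_ms
```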
- FIG. 4 is an example of metadata transmitted within a video stream.
- each frame of a video stream may contain embedded content that conveys the data to be transmitted. Such embedded content should be able to survive intermediate transmission and modification by intermediate devices. Furthermore, such embedded content should be recognizable and decodable by compatible receiver devices.
- inserting metadata into the one or more video frames of the annotated video stream may include inserting a 2-dimensional bar code representing the metadata into the one or more video frames.
- One such insertion technique could be to place a QR code 400 within each image.
- QR codes 400 can be of a same size, or of a threshold number of sizes with each size being visually identifiable (e.g., each size may be a particular color). Furthermore, the QR code 400 may occupy the entire video frame or a portion of the video frame.
- a further aspect is that a device inserting the QR code 400 (for example, video camera 110 ) may place a different QR code 400 in each video frame, or let each QR code 400 linger for a few frames. If each frame stores a number of metadata bits and the video has a frame rate, the bitrate of the disclosed metadata communication channel may be determined by multiplying the number of metadata bits per frame by the frame rate.
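For example, the channel capacity follows directly from the per-frame payload size and the frame rate (the numbers below are illustrative, not from the patent):

```python
def metadata_channel_bitrate(bits_per_frame: int, fps: float) -> float:
    """Bitrate of the in-band metadata channel in bits per second:
    payload bits carried by each frame multiplied by the frame rate."""
    return bits_per_frame * fps

# E.g., a code carrying 128 bytes per frame at 30 fps:
rate = metadata_channel_bitrate(128 * 8, 30)  # 30,720 bits/s
```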
- the metadata may be encoded by inserting a series of pixels along the edge of the image.
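One way to realize the edge-pixel encoding is to map each metadata bit to a black or white pixel along the top row of the frame. The sketch below operates on a frame represented as a list of rows of 8-bit luma values; the particular bit-to-pixel mapping and the thresholded read-back are assumptions:

```python
def embed_bits_top_row(frame, bits):
    """Write metadata bits into the first row: 1 -> 255 (white), 0 -> 0 (black)."""
    if len(bits) > len(frame[0]):
        raise ValueError("metadata does not fit in one row")
    for x, bit in enumerate(bits):
        frame[0][x] = 255 if bit else 0
    return frame

def extract_bits_top_row(frame, n_bits, threshold=128):
    """Recover the bits; thresholding tolerates mild transcoding noise."""
    return [1 if frame[0][x] >= threshold else 0 for x in range(n_bits)]
```

Extreme pixel values with a wide decision threshold are what lets such a pattern survive lossy re-encoding by intermediate devices.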
- the conventional vertical blanking interval (VBI) of a transmitted video signal may preferably be used to carry the metadata.
- transport of VBI data is also typically provided in digital television systems to support features of incumbent analogue video; since the VBI data is frame related, the VBI data may be used to carry a frame reference from the video camera 110 to the computing device 204 .
- any VBI line may be used, but typically, VBI lines that are used for Vertical Interval Timecode or for Teletext lines may be employed.
- the one or more streams may include multiple streams, of respective resolutions and/or frame rates, of the raw video captured by the camera(s) 110 .
- the multiple streams may include a “primary” stream with a certain resolution and frame rate, corresponding to the raw video captured by the camera 110 , and one or more additional streams.
- An additional stream may be the same video stream as the “primary” stream but at a different resolution and/or frame rate, or a stream that captures a portion of the “primary” stream (e.g., cropped to include a portion of the field of view or pixels of the primary stream) at the same or different resolution and/or frame rate as the “primary” stream.
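The relationship between the “primary” stream and derived streams can be expressed as simple descriptors. Names and fields below are assumptions for illustration only:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass(frozen=True)
class StreamDescriptor:
    width: int
    height: int
    fps: float
    # Optional crop region (x, y, w, h) in primary-stream pixels
    crop: Optional[Tuple[int, int, int, int]] = None

# "Primary" stream: the raw video as captured by the camera
primary = StreamDescriptor(1920, 1080, 30.0)
# Same view at reduced resolution and frame rate
preview = StreamDescriptor(640, 360, 15.0)
# A cropped portion of the primary field of view at the full frame rate
detail = StreamDescriptor(640, 360, 30.0, crop=(600, 200, 640, 360))
```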
- the computing device 204 may send one or more instructions to the camera 110 .
- a technician may use a control console (not shown in FIG. 3 ) communicatively coupled to the computing device 204 to send an instruction 302 to the camera 110 .
- the instruction may be a PTZ instruction.
- the PTZ instruction 302 may request the pan/tilt base of the camera 110 to pan the camera 110 left.
- the computing device 204 may start the clock 306 to measure PTZ round trip latency.
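A round-trip measurement of this kind could look as follows, assuming the acknowledgement arrives as a metadata field naming the instruction (the monotonic clock and the 'ack' field name are assumptions):

```python
import time

class PtzLatencyClock:
    """Measure time between sending a PTZ instruction and seeing its
    acknowledgement in the in-band metadata of a received frame."""

    def __init__(self):
        self._pending = {}  # instruction id -> send time

    def instruction_sent(self, instr_id: str) -> None:
        """Start the clock when the instruction leaves the computing device."""
        self._pending[instr_id] = time.monotonic()

    def frame_received(self, metadata: dict):
        """Return round-trip latency in seconds if this frame acknowledges
        a pending instruction (hypothetical 'ack' metadata key), else None."""
        instr_id = metadata.get("ack")
        if instr_id in self._pending:
            return time.monotonic() - self._pending.pop(instr_id)
        return None
```

Because the acknowledgement rides inside a frame, the measured interval covers the full chain: instruction delivery, execution by the camera, and frame transport back through the intermediate devices.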
- a video stream comprising a plurality of video frames 302 is transmitted from the video camera 110 to the computing device 204 via the recorder device 117 .
- the recorder device 117 in this example represents an intermediate device.
- the video camera 110 may perform the received instruction. For example, the camera 110 may adjust its position according to the received PTZ instruction 302 . Subsequent portions of the video stream should reflect the adjusted field of view.
- the video camera 110 may insert metadata that acknowledges the received instruction into the next outgoing frame.
- the metadata may include at least one of the following information related to the video camera 110 : unique video camera identifier (ID), group video camera ID, one or more associated user IDs, geolocation information, timestamp of captured video, direction, inclination, angle, moment, velocity, acceleration, or other environmental parameters.
- the metadata embedded within the one or more frames represents an in-band channel.
- the system memory 22 may be any memory for storing data used herein and/or computer programs that are executable by the processor 21 .
- the system memory 22 may include volatile memory such as a random access memory (RAM) 25 and non-volatile memory such as a read only memory (ROM) 24 , flash memory, etc., or any combination thereof.
- the computer system 600 may include one or more storage devices such as one or more removable storage devices 27 , one or more non-removable storage devices 28 , or a combination thereof.
- the one or more removable storage devices 27 and non-removable storage devices 28 are connected to the system bus 23 via a storage interface 32 .
- the storage devices and the corresponding computer-readable storage media are power-independent modules for the storage of computer instructions, data structures, program modules, and other data of the computer system 600 .
- the system memory 22 , removable storage devices 27 , and non-removable storage devices 28 may use a variety of computer-readable storage media.
- Examples of computer-readable storage media include machine memory such as cache, static random access memory (SRAM), dynamic random access memory (DRAM), zero capacitor RAM, twin transistor RAM, enhanced dynamic random access memory (eDRAM), extended data output random access memory (EDO RAM), double data rate random access memory (DDR RAM), electrically erasable programmable read-only memory (EEPROM), NRAM, resistive random access memory (RRAM), silicon-oxide-nitride-silicon (SONOS) based memory, phase-change random access memory (PRAM); flash memory or other memory technology such as in solid state drives (SSDs) or flash drives; magnetic cassettes, magnetic tape, and magnetic disk storage such as in hard disk drives or floppy disks; optical storage such as in compact disks (CD-ROM) or digital versatile disks (DVDs); and any other medium which may be used to store the desired data and which can be accessed by the computer system 600 .
- the system memory 22 , removable storage devices 27 , and non-removable storage devices 28 of the computer system 600 may be used to store an operating system 35 , additional program applications 37 , other program modules 38 , and program data 39 .
- the computer system 600 may include a peripheral interface 46 for communicating data from input devices 40 , such as a keyboard, mouse, stylus, game controller, voice input device, touch input device, or other peripheral devices, such as a printer or scanner via one or more I/O ports, such as a serial port, a parallel port, a universal serial bus (USB), or another peripheral interface.
- a display device 47 such as one or more monitors, projectors, or integrated display, may also be connected to the system bus 23 across an output interface 48 , such as a video adapter.
- the computer system 600 may be equipped with other peripheral output devices (not shown), such as loudspeakers and other audiovisual devices
- the computer system 600 may operate in a network environment, using a network connection to one or more remote computers 49 .
- the remote computer (or computers) 49 may be local computer workstations or servers comprising most or all of the elements described above with respect to the computer system 600 .
- Other devices may also be present in the computer network, such as, but not limited to, routers, network stations, peer devices or other network nodes.
- the computer system 600 may include one or more network interfaces 51 or network adapters for communicating with the remote computers 49 via one or more networks such as a local-area computer network (LAN) 50 , a wide-area computer network (WAN), an intranet, and the Internet.
- Examples of the network interface 51 may include an Ethernet interface, a Frame Relay interface, SONET interface, and wireless interfaces.
- the computer system 600 includes a hardware processor configured to receive, via one or more intermediate devices in the CCTV system, a video stream generated by a sensor device of the CCTV system.
- the video stream includes a plurality of video frames.
- the hardware processor is also configured to: send, via the one or more intermediate devices of the CCTV system, an instruction to a sensor device configured to generate a video stream including a plurality of video frames and receive, via the one or more intermediate devices of the CCTV system, one or more frames of the plurality of video frames embedded with metadata associated with performance of the instruction by the sensor device.
- the hardware processor is further configured to evaluate performance of the CCTV system using the metadata embedded within the one or more video frames.
- a video source 702 may send video data to a video destination 706 (e.g., a downstream device/client such as the computing device 204 in FIG. 3 ) through one or more intermediate devices 704 (e.g., the recorder 117 in FIG. 3 ).
- the intermediate devices 704 may forward the video data from the video source 702 to the video destination 706 but may otherwise not participate in the communication/handling of any metadata that is embedded by the video source 702 in each video frame of the video data.
- the intermediate devices 704 may also perform some transformations on the video data.
- the intermediate devices 704 may change the video data resolution, transcode the video data from one codec to another, etc.
- the metadata and/or the transformations performed by the intermediate devices 704 are configured such that the metadata that is embedded in the video frames survives the transformations performed by the intermediate devices 704 .
- an intermediate device 704 may change the codec used for encoding the video frames.
- the intermediate device 704 may decode the video frame and then re-encode the decoded video frame, e.g., may decode each video frame from H.264 and then re-encode the video frame as H.265.
- the actual picture in each video frame is still the same picture, just encoded with a different codec. Accordingly, the picture of a QR code that carries the metadata in the video frame may survive this transcoding, and the video destination 706 receiving the video frames can recognize each embedded QR code and extract the data from it.
- Some further aspects provide augmented in-band video communication. Specifically, in addition to the video source 702 being able to communicate with the video destination 706 via in-band metadata (e.g., a QR code per video frame), some present aspects extend in-band communication such that one or more of the intermediate devices 704 can amend the metadata, e.g., change the metadata, add to the metadata, subtract from the metadata, etc.
- a compatible intermediate device 710 , e.g., an intermediate device that has the capability to manipulate the metadata embedded in the video frames, can participate in the in-band one-way communication.
- the compatible intermediate device 710 may decode the inbound metadata, then relay all/none/part of the decoded metadata along with adding/replacing/amending metadata of the compatible intermediate device 710 .
- a compatible intermediate device 710 may identify a QR code that includes the metadata in a video frame, remove the QR code, and then insert a different/modified QR code in the same place in the video frame. In doing so, the compatible intermediate device 710 may include diagnostic information of the compatible intermediate device 710 into the modified QR code.
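The decode-amend-re-encode step of a compatible intermediate device 710 can be sketched on the metadata payload itself, treating QR encode/decode as black boxes; the JSON payload and the 'hops' field are hypothetical:

```python
import json

def amend_metadata(inbound_payload: str, device_id: str, diagnostics: dict) -> str:
    """Relay the upstream metadata while appending this device's own
    diagnostics, mimicking the remove-QR / insert-modified-QR step
    described above (assumed 'hops' list collects per-device entries)."""
    meta = json.loads(inbound_payload)
    hops = meta.setdefault("hops", [])
    hops.append({"device": device_id, **diagnostics})
    return json.dumps(meta, separators=(",", ":"))
```

Downstream, the video destination 706 would see both the camera's original fields and one appended entry per compatible device on the path.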
- Metadata-capable intermediate devices have the ability to communicate with any compatible downstream device within the chain of video processing equipment.
- some present aspects provide a way for the compatible intermediate device 710 to regenerate the metadata for placement in the outbound video stream.
- the path from the video source 702 to the video destination 706 may include one or more incompatible intermediate devices 708 and one or more compatible intermediate devices 710 .
- the incompatible intermediate devices 708 may merely forward the video data without modifying the embedded metadata, while the one or more compatible intermediate devices 710 may modify the embedded metadata before forwarding the video data.
- some present aspects allow for additional communication features so that compatible intermediate devices can receive data from any upstream device, send data to any downstream device, and inject their own statistics and diagnostic information such as video quality data on the inbound and outbound sides, latency timing data (to help highlight bottlenecks in the video transmission chain), etc.
- incompatible intermediate devices will be transparent to this metadata communication channel.
- each compatible intermediate device may include a QR code decoder and encoder. After decoding the incoming metadata that is embedded within a video frame, the compatible intermediate device may then edit/add/replace the metadata with the metadata of the compatible intermediate device. The new metadata is then encoded and placed in each outbound video frame. This may require encoding of the outbound video (e.g., H.264 encoding), which the compatible intermediate device may already be doing (e.g., transcoding the video to a lower resolution to save outbound bandwidth).
- some present aspects provide a one-way communication “bus” with multiple participants which include not only a source and a destination but also intermediate devices that may no longer be transparent to the in-band metadata communicated over the one-way communication bus.
- out-of-band channels may also allow devices in a chain of video equipment to communicate.
- for metadata that is tightly coupled to each video frame, synchronizing the metadata with the corresponding frame may become difficult when the metadata is communicated over out-of-band channels.
- while the present aspects are described herein with reference to a CCTV system, the present aspects are not so limited, and are applicable to other video transport systems or video management systems.
- the present aspects are applicable to TV broadcasting or IPTV streaming where the embedded metadata reaches a collection of destinations.
- in such cases, instead of a chain of devices, the communication fans out to multiple endpoints, and a hierarchy of devices exists, with video being sent to multiple destinations at multiple stages.
- the communication is still one-way, but the topology becomes a tree rather than a linear chain of devices.
- the present aspects are also applicable for delivering software updates or security keys to a large collection of devices, such as, for example, set top boxes, one in each customer's house.
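The tree topology described above can be sketched as a one-way fan-out: each compatible node appends its own identifier to the in-band metadata and forwards a copy to every downstream child, so every endpoint receives the full path it traversed. The node/children structure and `propagate()` are illustrative names, not part of the disclosure.

```python
def propagate(node, path):
    """Fan a metadata path out through a tree of devices; leaves are destinations."""
    stamped = path + [node["id"]]            # this node adds itself in-band
    children = node.get("children", [])
    if not children:
        return [stamped]                     # a leaf is a final destination
    delivered = []
    for child in children:
        delivered.extend(propagate(child, stamped))
    return delivered

# e.g. a headend feeding regional stages, each feeding set top boxes:
tree = {"id": "headend", "children": [
    {"id": "region-a", "children": [{"id": "stb-1"}, {"id": "stb-2"}]},
    {"id": "region-b", "children": [{"id": "stb-3"}]},
]}
paths = propagate(tree, [])
```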
- FIGS. 8 and 9 are flowcharts of example methods 800 and 900 for video management within a video transport system, which may be or may include, but is not limited to, a CCTV system, a video management system, a broadcast TV system, etc.
- Each one of methods 800 or 900 may be performed by an apparatus such as the computer system 600 implementing all or a portion of an applicable device described herein, such as the computing device 204 , the recorder 117 , the video destination 706 , the compatible intermediate device 710 , etc.
- the computing device 204 , the video destination 706 , the computer system 600 , or the processor 21 may be configured to or may comprise means for receiving, by an intermediate device in the CCTV system, one or more frames generated by a sensor device in the CCTV system and configured for reception by a computer device via one or more intermediate devices within the CCTV system, wherein the one or more frames are embedded with first metadata associated with a first performance of the sensor device.
- the sensor device is configured to embed, within the one or more frames, a first 2-dimensional bar code representing the first metadata associated with the first performance of the sensor device, wherein the first 2-dimensional bar code occupies at least a portion of the one or more frames.
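A toy illustration of a 2-dimensional bar code occupying a portion of a frame, as claimed above: the bar code modules (1 = dark, 0 = light) overwrite a rectangular pixel region. A real sensor device would generate the modules with an actual QR encoder; `embed_barcode` and the tiny grayscale frame here are illustrative stand-ins.

```python
def embed_barcode(frame, modules, origin=(0, 0)):
    """Overwrite a rectangular region of the frame with bar code modules."""
    r0, c0 = origin
    for r, row in enumerate(modules):
        for c, bit in enumerate(row):
            frame[r0 + r][c0 + c] = 0 if bit else 255   # dark or light pixel
    return frame

frame = [[128] * 8 for _ in range(8)]   # 8x8 grayscale stand-in for a frame
modules = [[1, 0], [0, 1]]              # stand-in for real QR modules
embed_barcode(frame, modules)           # bar code occupies the top-left corner
```

Pixels outside the `origin` region are untouched, matching the claim language that the bar code occupies only "at least a portion" of the frame.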
Abstract
Description
Claims (18)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/313,980 US12363261B2 (en) | 2021-07-08 | 2023-05-08 | In-band video communication |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/370,975 US11671567B2 (en) | 2021-07-08 | 2021-07-08 | In-band video communication |
| US18/313,980 US12363261B2 (en) | 2021-07-08 | 2023-05-08 | In-band video communication |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/370,975 Continuation-In-Part US11671567B2 (en) | 2021-07-08 | 2021-07-08 | In-band video communication |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20230283750A1 (en) | 2023-09-07 |
| US12363261B2 (en) | 2025-07-15 |
Family
ID=87850194
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/313,980 Active US12363261B2 (en) | 2021-07-08 | 2023-05-08 | In-band video communication |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US12363261B2 (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20260006157A1 (en) * | 2024-06-26 | 2026-01-01 | Toshiba Global Commerce Solutions, Inc. | Associating and configuring a network-attached camera |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080288986A1 (en) * | 2005-03-04 | 2008-11-20 | Armida Technologies | Wireless integrated security controller |
| US10136166B1 (en) * | 2014-12-08 | 2018-11-20 | The Directv Group, Inc. | Method and system for communicating inserted material to a client device in a centralized content distribution system |
| US10212306B1 (en) * | 2016-03-23 | 2019-02-19 | Amazon Technologies, Inc. | Steganographic camera communication |
| US20200285226A1 (en) * | 2019-03-06 | 2020-09-10 | Johnson Controls Technology Company | Systems and methods for sensor diagnostics and management |
| US20200334470A1 (en) * | 2017-10-25 | 2020-10-22 | Hitachi, Ltd. | Methods and apparatus for automated surveillance systems |
| US20210168346A1 (en) * | 2019-12-02 | 2021-06-03 | Comcast Cable Communications, Llc | Methods and systems for condition mitigation |
- 2023-05-08: US application US 18/313,980 patented as US12363261B2 (en), status Active
Also Published As
| Publication number | Publication date |
|---|---|
| US20230283750A1 (en) | 2023-09-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12184716B2 (en) | Delivering content in multiple formats | |
| US9954717B2 (en) | Dynamic adaptive streaming over hypertext transfer protocol as hybrid multirate media description, delivery, and storage format | |
| CA2656826C (en) | Embedded appliance for multimedia capture | |
| US20140010517A1 (en) | Reduced Latency Video Streaming | |
| US11671567B2 (en) | In-band video communication | |
| US12363261B2 (en) | In-band video communication | |
| US20080122986A1 (en) | Method and system for live video production over a packeted network | |
| US11350161B2 (en) | Digital video recorder with additional video inputs over a packet link | |
| KR101352860B1 (en) | Multi stream system and multi stream display method thereof | |
| US8706843B2 (en) | Network connector device | |
| EP3171606B1 (en) | Information processing device and information processing method | |
| AU2019204751B2 (en) | Embedded appliance for multimedia capture | |
| CN116545758A (en) | Conference audio and video summary processing encryption storage system | |
| CA2914803C (en) | Embedded appliance for multimedia capture | |
| Zhang et al. | Integrated Civil Monitoring System Based on POSA | |
| HK1136063B (en) | Embedded appliance for multimedia capture | |
| AU2013254937A1 (en) | Embedded Appliance for Multimedia Capture |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| AS | Assignment |
Owner name: CONTROLLED ELECTRONIC MANAGEMENT SYSTEMS LIMITED, GREAT BRITAIN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FEE, PAUL;REEL/FRAME:065420/0492 Effective date: 20210625 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| AS | Assignment |
Owner name: TYCO FIRE & SECURITY GMBH, SWITZERLAND Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:JOHNSON CONTROLS TYCO IP HOLDINGS LLP;REEL/FRAME:068494/0384 Effective date: 20240201 Owner name: TYCO FIRE & SECURITY GMBH, SWITZERLAND Free format text: ASSIGNMENT OF ASSIGNOR'S INTEREST;ASSIGNOR:JOHNSON CONTROLS TYCO IP HOLDINGS LLP;REEL/FRAME:068494/0384 Effective date: 20240201 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |