WO2006044789A2 - Video monitoring application, device architectures, and system architecture - Google Patents
- Publication number
- WO2006044789A2 (PCT/US2005/037235)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video
- network
- application
- mobile
- imaging
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F15/00—Digital computers in general; Data processing equipment in general
- G06F15/16—Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
- G06F15/163—Interprocessor communication
- G06F15/173—Interprocessor communication using an interconnection network, e.g. matrix, shuffle, pyramid, star, snowflake
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19654—Details concerning communication with a camera
- G08B13/19658—Telephone systems used to communicate with a camera, e.g. PSTN, GSM, POTS
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19665—Details related to the storage of video surveillance data
- G08B13/19667—Details related to data compression, encryption or encoding, e.g. resolution modes for reducing data volume to lower transmission bandwidth or memory requirements
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
- G08B13/19684—Portable terminal, e.g. mobile phone, used for viewing video remotely
-
- H—ELECTRICITY
- H03—ELECTRONIC CIRCUITRY
- H03M—CODING; DECODING; CODE CONVERSION IN GENERAL
- H03M7/00—Conversion of a code where information is represented by a given sequence or number of digits to a code where the same, similar or subset of information is represented by a different sequence or number of digits
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B1/00—Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
- H04B1/66—Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission for reducing bandwidth of signals; for improving efficiency of transmission
Definitions
- The present invention relates to data compression, and more particularly to still image and video image recording in video monitoring systems, corresponding device architectures, and system architectures for transmitting, storing, editing, processing, and transcoding still images over wireless and wired networks and viewing them on display-enabled devices, as well as distributing and updating codecs across networks and devices.
- The intent of the transform stage in a video compressor is to gather the energy or information of the source picture into as compact a form as possible by taking advantage of local similarities and patterns in the picture or sequence.
- Compressors are designed to work well on "typical" inputs and can ignore their failure to compress "random" or "pathological" inputs.
- Many image compression and video compression methods, such as MPEG-2 and MPEG-4, use the discrete cosine transform (DCT) as the transform stage.
- Quantization typically discards information after the transform stage.
- The reconstructed decompressed image is then not an exact reproduction of the original.
- Entropy coding is generally a lossless step: this step takes the information remaining after quantization and usually codes it so that it can be reproduced exactly in the decoder. Thus the design decisions about what information to discard in the transform and quantization stages are typically not affected by the following entropy-coding stage.
- A limitation of DCT-based video compression/decompression (codec) techniques is that, having been developed originally for video broadcast and streaming applications, they rely on the encoding of video content in a studio environment, where high-complexity encoders can be run on computer workstations. Such computationally complex encoders allow computationally simple and relatively inexpensive decoders (players) to be installed in consumer playback devices.
- Such asymmetric encode/decode technologies are not ideal for many emerging video monitoring devices and applications in which video messages are captured and encoded in real time in devices with limited computational resources.
- The instant invention presents solutions to the shortcomings of prior art compression techniques and provides a highly sophisticated yet computationally efficient image and video codec that can be implemented as an all-software (or hybrid) application on mobile handsets and still image and video monitoring cameras, reducing the complexity of the device architecture and the complexity of the imaging service platform architecture.
- Aspects of the present invention's all-software or hybrid video codec solution substantially reduce or eliminate baseband processor and video accelerator costs and requirements in multimedia handsets and cameras.
- The present invention in all-software or hybrid solutions substantially reduces the complexity, risk, and cost of both handset and camera device development and video monitoring service architecture and deployment.
- Software video transcoders enable automated over-the-network (OTN) upgrade of deployed multimedia messaging service controller (MMSC) infrastructure as well as deployment or upgrade of codecs to mobile handsets and camera devices.
- The present invention's wavelet transcoders provide carriers with complete interoperability between the wavelet video format and other standards-based and proprietary video formats.
- The present all-software or hybrid video platform allows rapid deployment of new MMS services that leverage processing speed and video production accuracy not available with prior art technologies.
- The present wavelet codecs are also unique in their ability to efficiently process both still images and video, and can thus replace separate codec formats with a single lower-cost and lower-power solution that can simultaneously support both still images and video images in monitoring applications, as well as in other services.
- Fig. 1 illustrates an architecture of a video monitoring system using analog CCTV cameras.
- Fig. 2 illustrates an architecture of a video monitoring system using digital video cameras and IP network.
- Fig. 3 illustrates an architecture of a video monitoring system using analog cameras with external digital video codecs and IP network interface.
- Fig. 4 illustrates an architecture of a digital video monitoring system using a wireless device with integrated video display capability.
- Fig. 5 illustrates an architecture of a digital monitoring camera with integrated IP network interface.
- Fig. 6 illustrates physical display size and resolution differences between common video display formats.
- Fig. 7 illustrates a mobile imaging handset architecture.
- Fig. 8 illustrates a mobile imaging service platform architecture.
- Fig. 9 illustrates a system for joint source-channel coding.
- Fig. 10 schematically compares the differences in processing resources between a DCT encoder and an improved wavelet encoder of the present invention.
- Fig. 11 illustrates an improved system for joint source-channel coding.
- Fig. 12 illustrates an improved architecture of a digital monitoring camera with integrated IP network interface.
- Fig. 13 illustrates an improved mobile imaging handset platform architecture.
- Fig. 14 illustrates an improved video monitoring system architecture using digital network cameras with integrated wavelet-based codec, imaging application, and joint source-channel coding.
- Fig. 15 illustrates an improved video monitoring system architecture using analog cameras and external wavelet-based codec, imaging application and joint source-channel coding.
- Fig. 16 illustrates an improved video monitoring system architecture using a video-enabled wireless device with integrated wavelet-based codec, imaging application, and joint source-channel coding.
- Fig. 17 illustrates an improved mobile imaging service platform architecture using a video-enabled wireless device with integrated wavelet-based codec, imaging application, and joint source-channel coding.
- Fig. 18 illustrates an over-the-network upgrade of a deployed multimedia messaging service controller video gateway.
- Fig. 19 illustrates implementation options for a software imaging application in a network camera or wireless handset.
- Fig. 20 illustrates implementation options for a hardware-accelerated imaging application in a network camera or wireless handset.
- Fig. 21 illustrates implementation options for a hybrid hardware-accelerated and software imaging application in a network camera or wireless handset.
- Fig. 22 illustrates an improved content delivery platform for management and delivery of wavelet-compressed images, videos, and integrated multimedia messaging service messages, and provisioning of multimedia messaging album applications.
- A wavelet transform comprises the repeated application of wavelet filter pairs to a set of data, either in one dimension or in more than one.
- A 2-D wavelet transform applies the filter pairs in two dimensions (horizontal and vertical).
- Video codecs can use a 3-D wavelet transform (horizontal, vertical, and temporal).
- An improved, symmetrical 3-D wavelet-based video compression/decompression (codec) device is desirable to reduce the computational complexity and power consumption in video monitoring devices and applications to a level well below those required for DCT-based codecs, as well as to enable simultaneous support for processing still images and video images in a single codec.
- Such simultaneous support for still images and video images in a single codec would eliminate the need for separate MPEG (video) and JPEG (still image) codecs, or greatly improve compression performance and hence storage efficiency with respect to Motion JPEG codecs.
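- To make the structure of such a 3-D transform concrete, the following is a minimal Java sketch of one decomposition level, using an unnormalized Haar pair as the filter along each axis. The class and method names are illustrative assumptions, not taken from the patent; a real implementation would use the lifting filters and block structure described later.

```java
// Illustrative sketch: one level of a separable 3-D wavelet transform.
// An unnormalized Haar pair (sum/difference) stands in for the real
// lifting filters; frames[t] holds frame t as a w*h array of samples.
public final class Wavelet3DSketch {

    /** In-place Haar step on n sample pairs at the given offset/stride. */
    static void haarStep(int[] d, int off, int n, int stride) {
        for (int i = 0; i < n; i++) {
            int a = d[off + 2 * i * stride];
            int b = d[off + (2 * i + 1) * stride];
            d[off + 2 * i * stride] = a + b;        // low-pass (sum)
            d[off + (2 * i + 1) * stride] = a - b;  // high-pass (difference)
        }
    }

    /** One decomposition level over a group of frames (w and h even). */
    static void oneLevel(int[][] frames, int w, int h) {
        for (int[] f : frames) {
            for (int y = 0; y < h; y++) haarStep(f, y * w, w / 2, 1); // horizontal
            for (int x = 0; x < w; x++) haarStep(f, x, h / 2, w);     // vertical
        }
        for (int p = 0; p < w * h; p++) {                             // temporal
            for (int t = 0; t + 1 < frames.length; t += 2) {
                int a = frames[t][p], b = frames[t + 1][p];
                frames[t][p] = a + b;
                frames[t + 1][p] = a - b;
            }
        }
    }
}
```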
- The architecture of many deployed video monitoring systems typically consists of one or more analog closed-circuit TV (CCTV) cameras 110 remotely connected to one or more hard disk recorder (HDR) units 120.
- Images can be viewed either locally from the HDR 120, for example in a central video monitoring facility, or transmitted over a dedicated or shared network connection 140 for remote viewing, allowing any number of authorized viewers to view real-time or recorded video images simultaneously.
- Some newer video monitoring systems utilize digital IP cameras 210.
- Such cameras 210 enable digitization and compression of video signals directly in the camera 210, and the compressed video can then be transmitted directly from the camera 210 over Internet Protocol (IP) networks 220 to PC or server-based devices for remote storage, viewing, and further analysis.
- Such devices can include video monitoring devices 230, video storage devices 240, video analysis devices 250, video processing devices 260 and/or video distribution devices 270, each connected to networked PCs and/or servers 280.
- As shown in Fig. 3, in order to support the upgrade of legacy video monitoring systems using analog CCTV cameras 310, it is also possible to provide stand-alone digital codecs 312 and IP network interfaces 314 to the analog CCTV cameras 310 for interconnection with devices such as 330, 340, 350, 360, 370 and 380, analogous to the devices described above in reference to Fig. 2.
- Some newer video monitoring systems enable access to and viewing of digital compressed video over the network 412 using fixed or mobile wireless devices 410 equipped with video display capabilities.
- In addition to such video display capabilities, it would be desirable to enable real-time capture of video in wireless devices connected to video monitoring networks having such components as 414, 420, 430, 440, 450, 460, 470 and 480, described above in reference to Figs. 2 and 3.
- A digital video monitoring camera 510 is a video surveillance device that digitizes and compresses analog video and audio to minimize transmission bandwidth and storage 512 requirements.
- Such a camera 510 may also include an integrated IP network interface 514 that permits the camera 510 to stream video across IP-protocol networks 516, such as Local Area Networks (LANs) 518, without the expense of bulky coaxial cables.
- The core subsystems of such a digital video monitoring camera 510 include:
- Lens subsystem 520
- Imaging array (CCD or CMOS)
- Analog Processing and A/D Conversion 524 performs pre-amplification, signal conditioning, and analog-to-digital (A/D) signal conversion, with circuitry connected to or integrated with the analog imager array, for input to the digital processing.
- Digital Processing 526 performs motion compensation and other similar real-time image capture processing, color space conversion, compression/decompression, and post-processing such as image scaling and trans-rating.
- Processing Memory 528 stores executing code and data/parameters.
- Interfacing Logic and Controllers 530 provides interfacing to integrated storage and display, as well as to local external display monitors and other display/processing devices such as PCs.
- Network Interface 514 provides data packetization for data communication over the IP network 516, and transmits and receives voice/video data packets through the IP network 516.
- Audio Interface interfaces with the microphone/speaker and uses an audio codec to digitize the audio signal.
- Power Conversion converts input power from an AC adaptor or battery supply to run various functional blocks.
- Voice/video data may be stored using built-in or removable memory, and/or transmitted via non-real-time file transfer or real-time streaming over the IP network.
- SubQCIF 610 (Sub-Quarter Common Intermediate Format) is 128 pixels (picture elements) wide by 96 pixels high
- QQVGA 620 (Quarter-Quarter Video Graphics Array) is 160 by 120 pixels
- QCIF 630 is 176 by 144 pixels
- QVGA 640 is 320 by 240 pixels
- CIF 650 is 352 by 288 pixels
- VGA 660 is 640 by 480 pixels
- The largest current format, D1/HDTV 670 (high-definition television), is 720 by 480 pixels.
- Video monitoring applications typically require capture/display of video images in VGA 660 (640x480 pixels) or D1 670 (720x480) format or larger, at a display rate of 30 frames per second (fps) or higher, whereas commercially available digital video monitoring cameras are typically limited to capturing video images in CIF 650 (352x288) format or QCIF 630 (176x144 pixels) format or smaller, at a display rate of 15 fps or lower.
- This reduced video capture capability is due to the excessive processor power consumption and buffer memory required to complete the number, type, and sequence of computational steps associated with video compression/decompression using DCT transforms.
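- Some back-of-the-envelope arithmetic makes the gap concrete. Assuming YUV 4:2:0 sampling at 1.5 bytes per pixel (an illustrative assumption, not a figure from the patent):

```java
// Raw (uncompressed) pixel throughput the codec must keep up with:
long vga  = 640L * 480 * 3 / 2 * 30;   // VGA @ 30 fps  ~ 13.8 MB/s
long qcif = 176L * 144 * 3 / 2 * 15;   // QCIF @ 15 fps ~ 0.57 MB/s
// vga / qcif ~ 24, so meeting the monitoring requirement demands roughly
// 24x the per-second transform/quantization/entropy-coding work that a
// typical QCIF-limited camera or handset performs.
```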
- Codec functions may be implemented using such RISC processors, DSPs, ASICs, and RPDs as separate integrated circuits (ICs), or may combine one or more of the RISC processors, DSPs, ASICs, and RPDs integrated together in a system-in-a-package (SIP) or system-on-a-chip (SoC).
- An all-software implementation of such an improved video codec and imaging application according to aspects of the present invention is also desirable for the capability to be downloaded to, installed in, "bug-fixed", and upgraded in already-deployed digital monitoring cameras.
- Such an improved video codec and imaging application would also be desirable as an external device, in order to support the upgrade of legacy video monitoring systems using analog CCTV cameras 110 or 310.
- Wireless video monitoring includes the addition of digital camera functionality (still images) or camcorder functionality (video images) to mobile handsets, so that users can both capture (encode) video messages that they wish to send, and play back (decode) video messages that they receive.
- Adding digital camcorder functionality to mobile handsets may involve adding the following functions, either in hardware, software, or as a combination of hardware and software:
- Imager array 710: typically an array of CMOS or CCD pixels, with corresponding pre-amps and analog-to-digital (A/D) signal conversion circuitry
- Image processing functions 712: such as pre-processing, encoding/decoding (codec), and post-processing
- Imaging-enabled mobile handsets are limited to capturing smaller-size and lower-frame-rate video images than those typically required for video monitoring applications.
- Video monitoring applications typically require capture/display of video images in VGA 660 (640x480 pixels) or D1 670 (720 x 480) format or larger, at a display rate of 30 frames-per-second (fps) or higher, whereas commercially available imaging- enabled mobile handsets are limited to capturing video images in QCIF 630 format (176x144 pixels) or smaller, at a display rate of 15 fps or lower.
- This reduced video capture capability is due to the excessive processor power consumption and buffer memory required to complete the number, type, and sequence of computational steps associated with video compression/decompression using DCT transforms. Even with this reduced video capture capability of commercially available mobile handsets, specially designed integrated circuit chips have had to be built into the handset hardware to accomplish the compression and decompression.
- Codec functions may be implemented using RISC processors 724, DSPs 726, ASICs 728, and RPDs 730 as separate integrated circuits (ICs), or might combine one or more of the RISC processors 724, DSPs 726, ASICs 728, and RPDs 730 integrated together in a system-in-a-package (SIP) or system-on-a-chip (SoC).
- Codec functions running on RISC processors 724 or DSPs 726 in conjunction with the above hardware can be software routines, with the advantage that they can be modified in order to correct errors or upgrade functionality.
- The disadvantage of implementing certain complex, repetitive codec functions as software is that the resulting overall processor resource and power consumption requirements typically exceed those available in mobile communications devices.
- Codec functions running on ASICs 728 are typically fixed hardware implementations of complex, repetitive computational steps, with the advantage that specially tailored hardware acceleration can substantially reduce the overall power consumption of the codec.
- Codec functions running on RPDs 730 are typically routines that require both hardware acceleration and the ability to add or modify functionality in final mobile imaging handset products.
- The disadvantage of implementing certain codec functions on RPDs 730 is the larger number of silicon gates and higher power consumption required to support hardware reconfigurability in comparison to fixed ASIC 728 implementations.
- An imaging application constructed according to some aspects of the present invention reduces or eliminates complex, repetitive codec functions so as to enable wireless video monitoring handsets to capture VGA 660 (or larger) video at a frame rate of 30 fps with an all-software architecture. This arrangement simplifies the above architecture and enables handset costs compatible with high-volume commercial deployment.
- New multimedia handsets may also be required to support not only picture and video messaging capabilities, but also a variety of additional multimedia capabilities (voice, music, graphics) and wireless access modes (2.5G and 3G cellular access, wireless LAN, Bluetooth, GPS).
- The all-software imaging application provided by aspects of the present invention enables over-the-air (OTA) distribution and management of the imaging application in wireless video monitoring devices connected to commercial or private wireless networks.
- Key components of a typical mobile wireless network capable of supporting imaging services such as video monitoring may include:
- BSC/RNC: Basestation Controller/Radio Network Controller
- MSC: Mobile Switching Center
- GSN: Gateway Service Node
- MMSC: Mobile Multimedia Service Controller
- The video gateway 822 in an MMSC 820 serves to transcode between the different video formats that are supported by the imaging service platform. Transcoding is also utilized by wireless operators to support different voice codecs used in mobile telephone networks, and the corresponding voice transcoders are integrated into the RNC 814. Upgrading such a mobile imaging service platform with the architecture shown in Figure 8 includes deploying new handsets 810 and manually adding new hardware to the MMSC 820 video gateway 822.
- An all-software mobile imaging applications service platform constructed according to aspects of the present invention supports automated OTA upgrade of deployed handsets 810, and automated OTN upgrade of deployed MMSCs 820, in order to support deployment of new video monitoring services and applications.
- Video transmission over mobile wireless networks represents an extreme challenge because of the higher data rates typically required, in comparison to the transmission of other data/media types such as text, audio, and still images.
- The limited and varying channel bandwidth, along with the fluctuating noise and error characteristics of mobile networks, imposes further constraints and difficulties on video transport.
- Various joint source-channel coding techniques can be applied to adapt the video bit stream to different channel conditions (see Figure 9).
- The joint source-channel coding approach of the present invention can be scalable, so as to adapt to varying channel bandwidths and error characteristics.
- It supports scalability for multicast scenarios, in which different devices at the receiving end of the video stream may have different limitations on decoding computational power and display capabilities.
- The source video sequence 910 is first source coded (i.e., compressed) by source encoder 920, followed by error correction code (ECC) channel coding 930.
- Source coding typically uses such DCT-based compression techniques as H.263, MPEG-4, or Motion JPEG. Such coding techniques cannot be adjusted, as that of the present invention can, to provide real-time adjustment of the degree of compression carried out in the source encoder.
- This aspect of the present invention provides significant advantages, particularly when video is being captured, encoded, and transmitted through the communications network in real or near real time (as compared to embodiments in which the video is captured, encoded, and stored for later transmission).
- Exemplary channel coding methods are Reed-Solomon codes, BCH codes, FEC codes, and turbo codes.
- The joint source and channel coded video bit stream then passes through the rate controller 940 to match the channel bandwidth requirement while achieving the best reconstructed video quality.
- The rate controller performs discrete rate-distortion computations on the compressed video bit stream before it sends the video bit stream 950 for transmission over the channel 960. Due to limitations in computational power in mobile devices, typical rate controllers only consider the available channel bandwidth, and do not explicitly consider the error characteristics of the transmission channel 960.
- The source encoder has the capability of adjusting the compression so as to achieve variations in the compression ratio as small as from 1 to 5% and also from 1 to 10%. This is particularly enabled when varied compression factors are applied to separate subbands of data that together represent the data of one or more video images, as in the sketch below.
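- A minimal Java sketch of how per-subband quantization could realize such fine rate steps follows; the method and array names are assumptions for illustration, and the shift values would come from the rate controller.

```java
// Illustrative per-subband dyadic quantization: raising the shift on one
// small high-frequency subband trims output bits in increments of a few
// percent, giving the fine-grained compression-ratio control described above.
static void quantizeSubbands(int[][] subbands, int[] shifts) {
    for (int s = 0; s < subbands.length; s++) {
        int k = shifts[s];                  // per-subband shift from rate control
        for (int i = 0; i < subbands[s].length; i++) {
            int c = subbands[s][i];
            subbands[s][i] = (c >= 0) ? (c >> k) : -((-c) >> k); // round toward zero
        }
    }
}
```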
- The joint source-channel coded bitstream 950 is received over channel 960, ECC channel decoded in step 970, and source decoded in step 980 to render the reconstructed video 990.
- The present invention provides improved adaptive joint source-channel coding based on algorithms with higher computational efficiency, so that both instantaneous and predicted channel bandwidth and error conditions can be utilized in all three of the source coder 920, the channel coder 930, and the rate controller 940 to maximize control of both the instantaneous and average quality (video rate vs. distortion) of the reconstructed video signal 990.
- The improved adaptive joint source-channel coding technique provided by the present invention further allows wireless carriers and MMS service providers to offer a greater range of quality-of-service (QoS) performance and pricing levels to their consumer and enterprise customers, thus maximizing the revenues generated using their wireless network infrastructure.
- Multicast scenarios require a single adaptive video bit stream that can be decoded by many users. This is especially important in modern, large-scale, heterogeneous networks, in which network bandwidth limitations make it impractical to transmit multiple simulcast video signals specifically tuned for each user. Multicasting of a single adaptive video bit stream greatly reduces the bandwidth requirements, but requires generating a video bit stream that is decodable by multiple users, including high-end users with broadband wireless or wireline connections and wireless phone users with limited-bandwidth, error-prone connections. Due to limitations in computational power in mobile devices, the granularity of adaptive rate controllers is typically very coarse, for example producing only a 2-layer bit stream comprising a base layer and one enhancement layer (see the sketch below).
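- One way a single embedded bit stream could serve such heterogeneous receivers is truncation at layer boundaries. The sketch below is a hypothetical illustration: the layer-offset and layer-rate arrays are assumed bookkeeping, not part of the patent's bit-stream format.

```java
// Illustrative multicast truncation of an embedded, layered bit stream.
// layerEnd[k] = byte offset where layer k ends; layerBps[k] = rate it adds.
static byte[] substreamFor(byte[] stream, int[] layerEnd,
                           long[] layerBps, long receiverBps) {
    int k = 1;                                  // base layer is always sent
    long rate = layerBps[0];
    while (k < layerBps.length && rate + layerBps[k] <= receiverBps) {
        rate += layerBps[k++];                  // add enhancement layers that fit
    }
    return java.util.Arrays.copyOf(stream, layerEnd[k - 1]);
}
```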
- Another advantage provided by the present invention's improved adaptive joint source-channel coding based on algorithms with higher computational efficiency is that it enables support for a much higher level of network heterogeneity, in terms of channel types (wireless and wireline), channel bandwidths, channel noise/error characteristics, user devices, and user services.
- Java technology brings a wide range of devices, from servers and desktop computers to network cameras and mobile devices, together under one language and one technology. While the applications for this range of devices differ, Java technology works to bridge those differences where it counts, allowing developers who are functional in one area to leverage their skills across the spectrum of devices and applications.
- Java 2, Micro Edition (J2ME), the counterpart of the Standard (J2SE) and Enterprise (J2EE) editions, was introduced for developers working with devices that have limited hardware resources, such as PDAs, cell phones, pagers, television set-top boxes, networked cameras, remote telemetry units, and many other consumer electronic and embedded devices.
- J2ME is aimed at machines with as little as 128KB of RAM and with processors a lot less powerful than those used on typical desktop and server machines.
- J2ME actually consists of a set of profiles. Each profile is defined for a particular type of device - cell phones, PDAs, etc. - and consists of a minimum set of class libraries required for the particular type of device and a specification of a Java virtual machine required to support the device.
- The virtual machine specified in any J2ME profile is not necessarily the same as the virtual machine used in Java 2 Standard Edition (J2SE) and Java 2 Enterprise Edition (J2EE).
- Within each of these two categories, Sun identified classes of devices with similar roles. For example, all cell phones fell within one class, regardless of manufacturer. With the help of its partners in the Java Community Process (JCP), Sun then defined additional functionality specific to each vertical slice.
- A configuration is a Java virtual machine (JVM) and a minimal set of class libraries and APIs providing a run-time environment for a select group of devices.
- A configuration specifies a least-common-denominator subset of the Java language, one that fits within the resource constraints imposed by the family of devices for which it was developed. Because there is such great variability across user interface, function, and usage, even within a configuration, a typical configuration doesn't define such important pieces as the user interface toolkit and persistent storage APIs. The definition of that functionality belongs, instead, to what is called a profile.
- A J2ME profile is a set of Java APIs specified by an industry-led group that is meant to address a specific class of device, such as pagers and cell phones. Each profile is built on top of the least-common-denominator subset of the Java language provided by its configuration, and is meant to supplement that configuration.
- Two profiles important to mobile handheld devices are: the Foundation profile, which supplements the CDC, and the Mobile Information Device Profile (MIDP), which supplements the CLDC. More profiles are in the works, and specifications and reference implementations should begin to emerge soon.
- The Java Technology for the Wireless Industry (JTWI) specification, JSR 185, defines the industry-standard platform for the next generation of Java technology-enabled mobile phones. JTWI is defined through the Java Community Process (JCP) by an expert group of leading mobile device manufacturers, wireless carriers, and software vendors. JTWI specifies the technologies that must be included in all JTWI-compliant devices: CLDC 1.0 (JSR 30), MIDP 2.0 (JSR 118), and WMA 1.1 (JSR 120), as well as CLDC 1.1 (JSR 139) and MMAPI (JSR 135) where applicable.
- The JTWI specification raises the bar of functionality for high-volume devices, while minimizing API fragmentation and broadening the substantial base of applications that have already been developed for mobile phones.
- Benefits of JTWI include:
- Road map: A key feature of the JTWI specification is the road map, an outline of common functionality that software developers can expect in JTWI-compliant devices. January 2003 saw the first in a series of road maps expected to appear at six- to nine-month intervals, which will describe additional functionality consistent with the evolution of mobile phones. The road map enables all parties to plan for the future with more confidence: carriers can better plan their application deployment strategy, device manufacturers can better determine their product plans, and content developers can see a clearer path for their application development efforts. Carriers in particular will, in the future, rely on a Java VM to abstract/protect underlying radio/network functions from security breaches such as viruses, worms, and other "attacks" that currently plague the public Internet.
- The previously described video codec and imaging application for video monitoring is Java-based to allow for "write-once, run-anywhere" portability across a broad range of Java-enabled digital video cameras and wireless handsets, as well as for Java VM security and device/network robustness against viruses, worms, and other mobile network security "attacks", and simplified OTA codec download procedures.
- The Java-based imaging application conforms to JTWI specifications JSR-135 ("Mobile Media API") and JSR-234 ("Advanced Multimedia Supplements").
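- For orientation, the sketch below shows the standard JSR-135 calls behind which such a Java imaging application would sit: creating a capture Player, obtaining its RecordControl, and recording to a stream. The capture locator and the set of available controls vary by device, so treat the specifics as assumptions rather than guaranteed behavior.

```java
import java.io.ByteArrayOutputStream;
import javax.microedition.media.Manager;
import javax.microedition.media.Player;
import javax.microedition.media.control.RecordControl;

// Illustrative JSR-135 (Mobile Media API) capture-and-record path.
public final class CaptureSketch {
    public static byte[] recordClip(long millis) throws Exception {
        Player player = Manager.createPlayer("capture://video");
        player.realize();                                   // acquire resources
        RecordControl rc =
            (RecordControl) player.getControl("RecordControl");
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        rc.setRecordStream(out);                            // record destination
        player.start();
        rc.startRecord();
        Thread.sleep(millis);                               // capture duration
        rc.stopRecord();
        rc.commit();                                        // flush recorded data
        player.close();
        return out.toByteArray();
    }
}
```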
- 3-D wavelet transforms can be exploited to design video compression/decompression (codec) devices 1010 much lower in computational complexity than DCT-based codecs 1020 (see Figure 10).
- Processing resources used by such processes as color recovery and demodulation 1030, image transformation 1040, memory 1050, motion estimation 1060 / temporal transforms 1070, and quantization, rate control and entropy coding 1080 can be significantly reduced by utilizing 3-D wavelet codecs according to some aspects of the present invention.
- The application of a wavelet transform stage also enables design of quantization and entropy-coding stages with greatly reduced computational complexity.
- Further advantages of the 3-D wavelet codecs 1010 according to certain aspects of the present invention developed for mobile imaging applications, devices, and services include:
- Symmetric, low-complexity video encoding and decoding
- Wavelet transforms using short dyadic integer filter coefficients in the lifting structure: for example, the Haar, 2-6, and 5-3 wavelets and variations of them can be used. These use only adds, subtracts, and small fixed shifts; no multiplication or floating-point operations are needed.
- Lifting Scheme computation: The above filters can advantageously be computed using the Lifting Scheme, which allows in-place computation (see the sketch below). A full description of the Lifting Scheme can be found in Sweldens, Wim, "The Lifting Scheme: A custom-design construction of biorthogonal wavelets", Appl. Comput. Harmon. Anal. 3(2):186-200, 1996, incorporated herein by reference in its entirety. Implementing the Lifting Scheme in this application minimizes use of registers and temporary RAM locations, and keeps references local for highly efficient use of caches.
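- A minimal in-place lifting pass for the reversible 5-3 wavelet is sketched below (even indices become low-pass, odd become high-pass). Boundary handling uses simple symmetric extension, and only adds, subtracts, and shifts appear, as the text describes. This is an illustrative sketch, not the patent's implementation.

```java
// One in-place 5-3 lifting pass on a 1-D signal of even length n.
static void lift53(int[] x, int n) {
    // Predict step: odd positions become high-pass coefficients.
    for (int i = 1; i < n - 1; i += 2) {
        x[i] -= (x[i - 1] + x[i + 1]) >> 1;
    }
    x[n - 1] -= x[n - 2];                 // symmetric extension at the right edge
    // Update step: even positions become low-pass coefficients.
    x[0] += (x[1] + 1) >> 1;              // symmetric extension at the left edge
    for (int i = 2; i < n; i += 2) {
        x[i] += (x[i - 1] + x[i + 1] + 2) >> 2;
    }
}
```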
- Wavelet transforms in pyramid form with customized pyramid structure: each level of the wavelet transform sequence can advantageously be computed on half of the data resulting from the previous wavelet level, so that the total computation is almost independent of the number of levels (sketched below).
- The pyramid can be customized to leverage the advantages of the Lifting Scheme above and further economize on register usage and cache memory bandwidth.
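- The pyramid recursion can be sketched as below, reusing the lift53 pass above. Because each level touches only the low-pass half of the previous level's output, the total work is bounded by the geometric series N + N/2 + N/4 + ... < 2N. The explicit de-interleave copy here is for clarity; a real codec would use the block and register strategies described in the surrounding items instead.

```java
// Illustrative pyramid: each level transforms only the surviving low-pass band.
static void pyramid(int[] x, int n, int levels) {
    int[] tmp = new int[n];
    for (int len = n, l = 0; l < levels && len >= 2; l++, len >>= 1) {
        lift53(x, len);                          // lifting pass on current band
        System.arraycopy(x, 0, tmp, 0, len);     // de-interleave: lows first,
        int half = len / 2;                      // highs after them
        for (int i = 0; i < half; i++) {
            x[i] = tmp[2 * i];                   // next level's low-pass band
            x[half + i] = tmp[2 * i + 1];        // finished high-pass coefficients
        }
    }
}
```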
- Block structure in contrast to most wavelet compression implementations, the picture can advantageously be divided into rectangular blocks with each block being processed separately from the others. This allows memory references to be kept local and an entire transform pyramid can be done with data that remains in the processor cache, saving a lot of data movement within most processors. Block structure is especially important in hardware embodiments as it avoids the requirement for large intermediate storage capacity in the signal flow.
- Block boundary filters modified filter computations can be advantageously used at the boundaries of each block to avoid sharp artifacts, as described in applicants' U.S. Application No. 10/418,363, filed April 17, 2003, published as 2003/0198395 and entitled WAVELET TRANSFORM SYSTEM, METHOD AND COMPUTER PROGRAM PRODUCT, incorporated herein by reference in its entirety.
- Chroma temporal removal in certain embodiments, processing of the chroma- difference signals for every field can be avoided, instead using a single field of chroma for a GOP. This is described in applicants' U.S. Application No. 10/447,514, filed May 28, 2003, published as 2003/0235340 and entitled CHROMA TEMPORAL RATE REDUCTION AND HIGH-QUALITY PAUSE SYSTEM AND METHOD, incorporated herein by reference in its entirety.
- Temporal compression using 3D wavelets in certain embodiments, the very computationally expensive motion-search and motion-compensation operations of conventional video compression methods such as MPEG are not used. Instead, a field- to-field temporal wavelet transform can be computed. This is much less expensive to compute. The use of short integer filters with the Lifting Scheme here is also preferred.
- Dyadic quantization in certain embodiments, the quantization step of the compression process is accomplished using a binary shift operation uniformly over a range of coefficient locations. This avoids the per-sample multiplication or division required by conventional quantization.
- Run-of-zeros conversion: The amount of data to be handled by the entropy coder is reduced by first doing a run-of-zeros conversion.
- A method of counting runs of zeros on parallel processing architectures is used, as described in applicants' U.S. Application No. 10/447,455, filed May 28, 2003, published as 2003/0229773 and entitled PILE PROCESSING SYSTEM AND METHOD FOR PARALLEL PROCESSORS, incorporated herein by reference in its entirety. Note that most modern processing platforms have some parallel capability that can be exploited in this way.
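- The run-of-zeros conversion itself can be sketched in a few lines; the symbol layout (run length followed by the nonzero value) is an illustrative choice, not the pile format of the cited application.

```java
// Replace each run of zero coefficients with a single count so the entropy
// coder touches far fewer symbols. Returns the number of symbols written.
static int runOfZeros(int[] coeffs, int[] symbols) {
    int out = 0, run = 0;
    for (int c : coeffs) {
        if (c == 0) { run++; continue; }
        symbols[out++] = run;   // zeros preceding this nonzero coefficient
        symbols[out++] = c;     // the coefficient itself
        run = 0;
    }
    symbols[out++] = run;       // trailing zero run
    return out;
}
```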
- Cycle-efficient entropy coding: In certain embodiments, the entropy coding step of the compression process is done using techniques that combine the traditional table lookup with direct computation on the input symbol. Characterizing the symbol distribution in source still images or video leads to the use of such simple entropy coders as Rice-Golomb, exp-Golomb, or the Dyadic Monotonic. The choice of entropy coder details will often vary depending on the processor platform capabilities. Details of the Rice-Golomb and exp-Golomb coders are described in: Golomb, S.W. (1966), "Run-length encodings", IEEE Transactions on Information Theory, IT-12(3):399-401; R. F.
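- An order-0 exp-Golomb encoder is small enough to show whole. This sketch appends bits to a string for clarity where a real coder would pack bytes; the BitWriter here is a stand-in of ours, not an API from the patent.

```java
// Order-0 exp-Golomb: value v >= 0 is coded as (len-1) zero bits followed
// by the len-bit binary form of v+1, e.g. 0 -> "1", 1 -> "010", 3 -> "00100".
static final class BitWriter {
    final StringBuilder bits = new StringBuilder(); // illustrative accumulator
    void put(int b) { bits.append(b); }
}

static void expGolomb(int v, BitWriter w) {
    int code = v + 1;
    int len = 32 - Integer.numberOfLeadingZeros(code); // bit length of v+1
    for (int i = 0; i < len - 1; i++) w.put(0);        // zero prefix
    for (int i = len - 1; i >= 0; i--) w.put((code >>> i) & 1);
}
```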
- One method of adjusting the amount of compression, i.e., the rate of output bits produced, is to change the amount of information discarded in the quantization stage of the computation. Quantization is conventionally done by dividing each coefficient by a pre-chosen number, the "quantization parameter", and discarding the remainder of the division. Thus a range of coefficient values comes to be represented by the same single value, the quotient of the division.
- The inverse quantization step multiplies the quotient by the (known) quantization parameter. This restores the coefficients to their original magnitude range for further computation.
- Division (or, equivalently, multiplication) is an expensive operation in many implementations, in terms of power and time consumed, and in hardware cost. Note that the quantization operation is applied to every coefficient, and that there are usually as many coefficients as input pixels.
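- The cost difference is easy to see side by side. With a power-of-two quantization parameter, the divide/multiply pair collapses to shifts; the sketch below assumes exactly that, with sign handling shown explicitly.

```java
// Conventional quantization: one divide per coefficient, one multiply to invert.
static int quantizeDiv(int coeff, int qp) { return coeff / qp; }
static int dequantDiv(int q, int qp)      { return q * qp; }

// Dyadic quantization: qp = 1 << k, so a shift replaces the divide/multiply.
static int quantizeShift(int coeff, int k) {
    return (coeff >= 0) ? (coeff >> k) : -((-coeff) >> k); // round toward zero
}
static int dequantShift(int q, int k) { return q << k; }
```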
- A pile is a data storage structure in which data are represented with sequences of zeros (or of other common values) compressed. It should be noted that a subband may comprise several separate piles or storage areas. Alternately, a pile or storage area may comprise several separate subbands.
- The fine-grain scalability of the improved wavelet-based codec described above enables improved adaptive rate control, multicasting, and joint source-channel coding.
- The reduced computational complexity and higher computational efficiency of the improved wavelet algorithms allow information on both instantaneous and predicted channel bandwidth and error conditions to be utilized in all three of the source coder 1120, the channel coder 1130, and the rate controller 1140 to maximize control of both the instantaneous and average compression rates, which affect the quality (video rate vs. distortion) of the reconstructed video signal 1190 (see Figure 11).
- Available transmission bandwidth between a mobile device 810 and a cellular transmission tower 812, shown in Fig. 8, can vary in real time.
- The compression rate can be adjusted by making real-time processing changes in either the source encoder 1120, the channel encoder 1130, or the rate controller 1140, or with changes to a combination of these elements.
- Example rate change increments can vary from 1 to 5%, from 1 to 10%, from 1 to 15%, from 1 to 25%, and from 1 to 40%.
- The improved adaptive joint source-channel coding technique allows video monitoring network operators, wireless carriers, and MMS service providers to offer a greater range of quality-of-service (QoS) performance and pricing levels to their customers.
- Utilizing improved adaptive joint source-channel coding based on algorithms with higher computational efficiency enables support for a much higher level of network heterogeneity, in terms of channel types (wireless and wireline), channel bandwidths, channel noise/error characteristics, user devices, and user services.
- The reduced computational complexity of the video codec also enables reductions in the complexity of corresponding video processing and analysis applications. Such applications can then be integrated much more readily together with the video codec using the limited computational resources available in network cameras and wireless handsets.
- Fig. 12 illustrates an improved digital video monitoring camera architecture 1210 according to aspects of the present invention, with components similar to those in Fig. 5 labeled with similar reference numerals.
- The imaging application can be implemented as an all-software application running as native code or as a Java application on a RISC processor 1226 or DSP. Acceleration of the Java code operation may be implemented within the RISC processor 1226 itself, or using a separate Java accelerator IC. Such a Java accelerator may be implemented as a stand-alone IC, or this IC may be integrated with other functions in either a SIP or SoC.
- The improved digital video monitoring camera architecture illustrated in Figure 12 greatly reduces the computational and buffer memory 1228 requirements for the video codec and imaging application, supports processing of both still images and video, enables reductions in the complexity of corresponding video processing and analysis applications, enables such applications to be integrated with the video codec using the limited computational resources available in the network camera, and enables adaptive joint source-channel coding in support of connectivity over a much more heterogeneous range of network architectures 1232 and infrastructure equipment.
- Figure 13 illustrates an improved mobile imaging handset platform architecture.
- The imaging application is implemented as an all-software application running as native code or as a Java application on a RISC processor 1324. Acceleration of the Java code operation may be implemented within the RISC processor 1324 itself, or using a separate Java accelerator IC 1332. Such a Java accelerator 1332 may be implemented as a stand-alone IC, or this IC may be integrated with other functions in either a SIP or SoC.
- The improved mobile imaging handset platform architecture illustrated in Figure 13 greatly reduces the computational and buffer memory 1314 requirements for the video codec and imaging application, supports processing of both still images and video, enables reductions in the complexity of corresponding video processing and analysis applications, enables such applications to be integrated with the video codec using the limited computational resources available in the handset, and enables adaptive joint source-channel coding in support of connectivity over a much more heterogeneous range of network architectures and infrastructure equipment.
- Figure 14 illustrates an improved video monitoring system architecture using digital network cameras 1410 with integrated wavelet-based codec, imaging application, and joint source-channel coding.
- This architecture allows video monitoring network operators to take advantage of new, more flexible, lower-cost, and higher-speed digital network transmission, storage, and processing technologies.
- Figure 15 illustrates an improved video monitoring system architecture using analog cameras 1510 and external wavelet-based codec, imaging application, joint source-channel coding, and networking interfaces. This architecture allows video monitoring network operators to upgrade legacy video monitoring systems using analog CCTV cameras.
- Figure 16 illustrates an improved video monitoring system architecture using video-enabled wireless device(s) 1610 with integrated wavelet-based codec, imaging application, and joint source-channel coding.
- This architecture allows video monitoring network operators to enable real-time capture, storage, display, transmission, reception, processing, and analysis of video over wireless devices connected to video monitoring networks.
- Key components of an improved mobile wireless network capable of supporting imaging services such as video monitoring can include:
- BSC/RNC: Basestation Controller/Radio Network Controller
- GSN: Gateway Service Node
- MMSC: Mobile Multimedia Service Controller
- Typical functions of the MMSC can include the following:
- The video gateway 1722 in an MMSC 1720 serves to transcode between the different video formats that are supported by the imaging service platform. Transcoding is already utilized by wireless operators to support different voice codecs used in mobile telephone networks, and the corresponding voice transcoders are integrated into the RNC 1714.
- The steps involved in deploying the improved imaging service platform include the following.
- Step 2: Signal the network that the Video Gateway Transcoder application 1730 is available for updating on the deployed Video Gateways 1722.
- The download server 1721 signals the video gateways 1722 on the network of this availability.
- If accepted by the user, and transaction settlement is completed successfully, the Mobile Video Imaging Application 1734 (e.g., an updated video codec) is downloaded and installed to the wireless handset 1710 or digital monitoring camera 1710' via OTA procedures 1736. Just the encoder portion, just the decoder portion, or both the encoder and decoder portions of the Mobile Video Imaging Application 1734 can be downloaded and installed.
- The all-software wavelet-based video codec and imaging application, joint source-channel coding, network camera architecture, and wireless handset architecture are combined in the above wireless video monitoring service platform architecture to deliver higher digital video image quality using lower-cost and lower-complexity network cameras and wireless devices, to reduce service deployment complexity, risk, and costs via OTA/OTN deployment, and to enable operation over a much more heterogeneous range of network architectures and infrastructure equipment.
- The improved wavelet-based video codec and imaging application, joint source-channel coding, network camera architecture, wireless handset architecture, video monitoring network architectures, and wireless video monitoring service platform architecture described above also enable the deployment of the improved video monitoring and video messaging applications and services described below.
- A Multimedia Messaging Album including, but not limited to, the following modules:
- MMS Composer: integrates improved wavelet-compressed images and videos with sounds and text in one message.
- Mobile Media Album: a repository for wavelet-compressed images, videos, and integrated MMS messages.
- Multimedia Ring-Tone Composer: creates personal polyphonic ring tones that combine sound with wavelet-compressed images and videos.
- The improved wavelet-based video codec and imaging application, joint source-channel coding, network camera architecture, wireless handset architecture, video monitoring network architectures, wireless video monitoring service platform architecture, and video messaging/monitoring applications and services described here achieve the goal of delivering higher-quality digital video and images using network cameras and wireless devices with lower cost and complexity, reduced service deployment costs, and operation over a much more heterogeneous range of network architectures and infrastructure equipment.
- The imaging application can be installed via OTA download to the baseband multimedia processing section of the camera or handset, to a removable storage device, or to the imaging module or other location. Where desirable, the imaging application can also be installed during manufacturing or at point-of-sale to the baseband multimedia processing section of the camera or handset, to a removable storage device, or to the imaging module or other location. Additional implementation options are also possible as network camera and mobile device architectures evolve.
- Performance of the network camera or mobile imaging handset may be further improved, and costs and power consumption may be further reduced, by accelerating some computational elements via hardware-based processing resources in order to take advantage of ongoing advances in mobile device computational hardware (ASIC, DSP, RPD) and integration technologies (SoC, SIP).
- All-hardware options can be considered for integrating these hardware-based processing resources into the camera or handset (see Figure 20), with candidate locations including the baseband multimedia processing section, a removable storage device, and the imaging module.
- hybrid architectures for the imaging application may offer further enhancements by implementing computationally intensive, repetitive, fixed functions in hardware, while implementing in software those functions for which post-manufacturing modification may be desirable or required.
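A minimal sketch of this hardware/software partitioning follows, assuming a hypothetical accelerator interface: the fixed, repetitive transform step is dispatched to hardware when present, while the software path remains updatable after manufacture.

```python
# Illustrative hybrid partitioning. The Accelerator interface is a
# hypothetical stand-in for a fixed-function hardware block on the
# SoC/SIP; the software fallback can be replaced via OTA download.
def haar_step_software(samples):
    """One Haar analysis step in pure software (replaceable after sale)."""
    pairs = list(zip(samples[::2], samples[1::2]))
    approx = [(a + b) / 2 for a, b in pairs]   # low-pass half
    detail = [(a - b) / 2 for a, b in pairs]   # high-pass half
    return approx, detail

def forward_transform(samples, accelerator=None):
    # Computationally intensive fixed function -> hardware when the
    # device provides it; otherwise fall back to the software path.
    if accelerator is not None:
        return accelerator.haar_step(samples)  # hypothetical HW call
    return haar_step_software(samples)

print(forward_transform([10, 12, 8, 6]))  # ([11.0, 7.0], [-1.0, 1.0])
```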
- the data representing a particular compressed video can be transmitted over the telecommunications network to the MMSC, and that data can have a decoder for the compressed video attached to it.
- this can eliminate the video gateway that is otherwise necessary to transcode video data coming into the MMSC.
- the receiving wireless device, for example 1710, can receive the compressed video with the attached decoder and simply play the video on the receiving device's own platform. This provides significant efficiency and cost savings in the structure and operation of the MMSC.
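A minimal sketch of such a self-describing payload follows, assuming a simple JSON/base64 container of our own design rather than any format specified by this application.

```python
# Bundle the compressed bitstream together with its decoder so the
# MMSC can forward the message without any transcoding gateway.
import base64
import json

def pack(video: bytes, decoder: bytes) -> bytes:
    """Sender side: bundle bitstream and decoder into one payload."""
    return json.dumps({
        "video": base64.b64encode(video).decode("ascii"),
        "decoder": base64.b64encode(decoder).decode("ascii"),
    }).encode("utf-8")

def unpack(blob: bytes) -> tuple[bytes, bytes]:
    """Receiver side: recover the bitstream and its matching decoder."""
    doc = json.loads(blob)
    return base64.b64decode(doc["video"]), base64.b64decode(doc["decoder"])

payload = pack(b"wavelet-bitstream", b"decoder-stub")
video, decoder = unpack(payload)  # handset loads decoder, plays video
```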
- the wavelet processing can be designed to perform additional video processing functions as part of compression.
- for example, the wavelet processing can perform color space conversion, black/white balance, image stabilization, digital zoom, brightness control, and resizing, among other functions.
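As one illustration of why resizing falls out of wavelet processing, the sketch below uses the PyWavelets library as a stand-in for the codec described here (an assumption on our part): the first-level approximation band of an orthonormal Haar transform is, up to a factor of two, the half-resolution image, so a 2x downscale needs no separate resampling pass.

```python
# Wavelet-domain resizing: one analysis level already contains the
# half-resolution frame in its approximation band.
import numpy as np
import pywt

frame = np.random.rand(64, 64)                # stand-in luma plane
cA, (cH, cV, cD) = pywt.dwt2(frame, "haar")   # one analysis level
half_res = cA / 2.0                           # 32x32 downscaled frame
assert half_res.shape == (32, 32)
```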
- Another particular advantage of aspects of the present invention lies in the significantly improved voice synchronization they achieve.
- the voice is synchronized to every other frame of video.
- MPEG-4, by contrast, synchronizes voice only to every 15th frame. This results in significant de-synchronization of voice and video, particularly when video is transmitted imperfectly, as commonly occurs over mobile networks.
- having voice synchronized to every other frame of video also provides for efficient and expedited editing of the video while it resides in the MMSC, for example by automatic or remotely enabled video editing programs.
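A back-of-the-envelope comparison of the worst-case audio drift after a lost resynchronization point makes the difference concrete; the 15 fps frame rate is an illustrative assumption of ours.

```python
# Worst-case audio/video drift is bounded by the spacing of sync points.
def worst_case_drift_seconds(frames_between_sync_points: int, fps: float) -> float:
    """Longest stretch the audio can run without a resync opportunity."""
    return frames_between_sync_points / fps

print(worst_case_drift_seconds(15, 15.0))  # MPEG-4-style: up to 1.00 s
print(worst_case_drift_seconds(2, 15.0))   # every other frame: ~0.13 s
```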
- further advantages of aspects of the present invention arise because the present encoding techniques allow significantly more metadata, or significantly more easily embedded metadata, to be embedded in the video being generated and compressed.
- Such metadata can include, among other items, the time, the location where the video was captured (as discerned from the location systems in the mobile handset), and the user who made the recording. Furthermore, because certain embodiments of the present invention place a reference frame in every other frame of video, as compared to a reference frame in every 15 frames in MPEG-4 compressed video, embodiments of the present invention provide highly efficient searching and editing of video, as well as much improved audio synchronization.
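The sketch below shows one hypothetical way to tag each reference frame with such metadata; the container layout and the field names are illustrative assumptions, not a format defined by this application.

```python
# Tag a reference frame with capture-time metadata. With a reference
# frame every other frame, a tag-driven search lands within one frame
# of any instant in the stream.
from dataclasses import asdict, dataclass
import json
import time

@dataclass
class FrameMeta:
    timestamp: float   # capture time
    lat: float         # from the handset's location system
    lon: float
    user: str          # user who captured the video

def tag_reference_frame(frame: bytes, meta: FrameMeta) -> bytes:
    """Prefix the frame with a length-delimited metadata header."""
    header = json.dumps(asdict(meta)).encode("utf-8")
    return len(header).to_bytes(4, "big") + header + frame

tagged = tag_reference_frame(
    b"<compressed-reference-frame>",
    FrameMeta(time.time(), 37.7749, -122.4194, "alice"),
)
```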
- An improved wavelet-based video codec and imaging application, joint source-channel coding, network camera architecture, wireless handset architecture, video monitoring network architectures, wireless video monitoring/messaging service platform architecture, and video messaging/monitoring applications and services are provided by various aspects of the present invention. These improvements combine to substantially reduce the technical complexity and costs associated with offering high-quality still and video monitoring applications and services for retail businesses, banks, schools, enterprises, government offices, airports, transportation departments, military installations, and many other organizations.
- Improved adaptive joint source-channel coding allows video monitoring network operators, wireless carriers, and MMS service providers to offer a greater range of quality-of-service (QoS) performance and pricing levels to their customers, thus maximizing the revenues generated using their network infrastructure.
- Improved adaptive joint source-channel coding, based on algorithms with higher computational efficiency, enables support for a much higher degree of network heterogeneity in terms of channel types (wireless and wireline), channel bandwidths, channel noise/error characteristics, infrastructure equipment, user devices, and user services.
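As an illustration of the trade-off that adaptive joint source-channel coding manages, the sketch below uses a simple rate-allocation rule of our own devising (not the algorithm of this application): as measured packet loss rises, more of a fixed channel budget shifts from source bits to error protection. The 5%-50% overhead bounds are assumptions.

```python
# Split a fixed channel budget between source coding and FEC overhead
# as a function of measured packet loss.
def allocate(budget_kbps: float, packet_loss: float) -> tuple[float, float]:
    overhead = min(0.5, 0.05 + 2.25 * packet_loss)  # FEC share of budget
    fec_kbps = budget_kbps * overhead
    return budget_kbps - fec_kbps, fec_kbps          # (source, channel)

src, fec = allocate(128.0, packet_loss=0.08)  # e.g. a noisy cellular link
print(round(src, 1), round(fec, 1))           # 98.6 29.4
```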
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Hardware Design (AREA)
- Mathematical Physics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
- Closed-Circuit Television Systems (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Television Signal Processing For Recording (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2005295466A AU2005295466A1 (en) | 2004-10-13 | 2005-10-13 | Video monitoring application, device architectures, and system architecture |
JP2007536992A JP2008516566A (en) | 2004-10-13 | 2005-10-13 | Video monitoring application, device architecture and system architecture |
CA002583745A CA2583745A1 (en) | 2004-10-13 | 2005-10-13 | Video monitoring application, device architectures, and system architecture |
EP05808873A EP1800404A4 (en) | 2004-10-13 | 2005-10-13 | Video monitoring application, device architectures, and system architecture |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US61893804P | 2004-10-13 | 2004-10-13 | |
US60/618,938 | 2004-10-13 | ||
US65405805P | 2005-02-16 | 2005-02-16 | |
US60/654,058 | 2005-02-16 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2006044789A2 true WO2006044789A2 (en) | 2006-04-27 |
WO2006044789A3 WO2006044789A3 (en) | 2008-05-08 |
Family
ID=36203620
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2005/037235 WO2006044789A2 (en) | 2004-10-13 | 2005-10-13 | Video monitoring application, device architectures, and system architecture |
Country Status (6)
Country | Link |
---|---|
EP (1) | EP1800404A4 (en) |
JP (1) | JP2008516566A (en) |
KR (1) | KR20070085317A (en) |
AU (1) | AU2005295466A1 (en) |
CA (1) | CA2583745A1 (en) |
WO (1) | WO2006044789A2 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11622091B2 (en) * | 2019-10-31 | 2023-04-04 | Genetec Inc. | Facilitating video monitoring of at least one location |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6356543B2 (en) * | 1997-11-25 | 2002-03-12 | Telefonaktiebolaget Lm Ericsson (Publ) | Controlling mobile phone system user views from the world-wide web |
US6754894B1 (en) * | 1999-12-03 | 2004-06-22 | Command Audio Corporation | Wireless software and configuration parameter modification for mobile electronic devices |
JP2001285420A (en) * | 2000-03-24 | 2001-10-12 | Telefon Ab L M Ericsson | Mobile radio communication equipment, communication system, and printer |
US20020112237A1 (en) * | 2000-04-10 | 2002-08-15 | Kelts Brett R. | System and method for providing an interactive display interface for information objects |
US20010030667A1 (en) * | 2000-04-10 | 2001-10-18 | Kelts Brett R. | Interactive display interface for information objects |
JP2002056340A (en) * | 2000-08-09 | 2002-02-20 | Konami Co Ltd | Game item providing system, its method, and recording medium |
JP2004038941A (en) * | 2002-04-26 | 2004-02-05 | Matsushita Electric Ind Co Ltd | Content adaptation method for terminal device, server and gateway of universal multimedia framework |
JP4272395B2 (en) * | 2002-08-09 | 2009-06-03 | 株式会社Access | Information providing method, information providing system, management device and software program using address information providing device independent of communication network |
JP2004118291A (en) * | 2002-09-24 | 2004-04-15 | Hitachi Kokusai Electric Inc | Software management system and failure management device |
WO2004044710A2 (en) * | 2002-11-11 | 2004-05-27 | Supracomm, Inc. | Multicast videoconferencing |
US7430602B2 (en) * | 2002-12-20 | 2008-09-30 | Qualcomm Incorporated | Dynamically provisioned mobile station and method therefor |
2005
- 2005-10-13 KR KR1020077010760A patent/KR20070085317A/en not_active Application Discontinuation
- 2005-10-13 EP EP05808873A patent/EP1800404A4/en not_active Withdrawn
- 2005-10-13 AU AU2005295466A patent/AU2005295466A1/en not_active Abandoned
- 2005-10-13 JP JP2007536992A patent/JP2008516566A/en active Pending
- 2005-10-13 WO PCT/US2005/037235 patent/WO2006044789A2/en active Application Filing
- 2005-10-13 CA CA002583745A patent/CA2583745A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See references of EP1800404A4 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009542046A (en) * | 2006-06-16 | 2009-11-26 | ドロップレット テクノロジー インコーポレイテッド | Video processing and application system, method and apparatus |
US9596293B2 (en) | 2010-09-08 | 2017-03-14 | Panasonic Intellectual Property Management Co., Ltd. | Content transmission device and network node |
CN102377989A (en) * | 2011-10-12 | 2012-03-14 | 广东志成冠军集团有限公司 | Area end network-based radio monitoring camera device |
Also Published As
Publication number | Publication date |
---|---|
CA2583745A1 (en) | 2006-04-27 |
EP1800404A2 (en) | 2007-06-27 |
AU2005295466A1 (en) | 2006-04-27 |
JP2008516566A (en) | 2008-05-15 |
EP1800404A4 (en) | 2009-01-07 |
KR20070085317A (en) | 2007-08-27 |
WO2006044789A3 (en) | 2008-05-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8896717B2 (en) | Methods for deploying video monitoring applications and services across heterogeneous networks | |
US20060072837A1 (en) | Mobile imaging application, device architecture, and service platform architecture | |
US8849964B2 (en) | Mobile imaging application, device architecture, service platform architecture and services | |
AU2005295132A1 (en) | Mobile imaging application, device architecture, and service platform architecture | |
US8644632B2 (en) | Enhancing image quality | |
US8665943B2 (en) | Encoding device, encoding method, encoding program, decoding device, decoding method, and decoding program | |
US8254707B2 (en) | Encoding device, encoding method, encoding program, decoding device, decoding method, and decoding program in interlace scanning | |
EP2084907B1 (en) | Method and system for scalable bitstream extraction | |
US20140368672A1 (en) | Methods for Deploying Video Monitoring Applications and Services Across Heterogeneous Networks | |
EP1800404A2 (en) | Video monitoring application, device architectures, and system architecture | |
CN101390392A (en) | Video monitoring application, device architectures, and system architecture | |
WO2006089254A2 (en) | Mobile imaging application, device architecture, service platform architecture and services | |
WO2009002321A1 (en) | Enhancing image quality |
Legal Events
Code | Title | Description |
---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 200580038858.6; Country of ref document: CN |
AK | Designated states | Kind code of ref document: A2; Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents | Kind code of ref document: A2; Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
WWE | Wipo information: entry into national phase | Ref document number: 2583745; Country of ref document: CA. Ref document number: 2007536992; Country of ref document: JP |
NENP | Non-entry into the national phase | Ref country code: DE |
WWE | Wipo information: entry into national phase | Ref document number: 2005808873; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 1020077010760; Country of ref document: KR |
WWP | Wipo information: published in national office | Ref document number: 2005808873; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 2005295466; Country of ref document: AU |
ENP | Entry into the national phase | Ref document number: 2005295466; Country of ref document: AU; Date of ref document: 20051013; Kind code of ref document: A |
WWP | Wipo information: published in national office | Ref document number: 2005295466; Country of ref document: AU |