CN1722839A - Video encoding and decoding method, and video encoder and decoder - Google Patents

Video encoding and decoding method, and video encoder and decoder

Info

Publication number
CN1722839A
CN1722839A
Authority
CN
China
Prior art keywords
motion vector
layer motion
base layer
video
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2005100841402A
Other languages
Chinese (zh)
Other versions
CN100466735C (en)
Inventor
河昊振
韩宇镇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Publication of CN1722839A
Application granted
Publication of CN100466735C
Expired - Fee Related
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/103 Selection of coding mode or of prediction mode
    • H04N19/109 Selection of coding mode or of prediction mode among a plurality of temporal predictive coding modes
    • H04N19/134 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136 Incoming video signal characteristics or properties
    • H04N19/137 Motion inside a coding unit, e.g. average field, frame or block difference
    • H04N19/139 Analysis of motion vectors, e.g. their magnitude, direction, variance or reliability
    • H04N19/60 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding
    • H04N19/625 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding using discrete cosine transform [DCT]
    • H04N19/90 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using coding techniques not provided for in groups H04N19/10-H04N19/85, e.g. fractals
    • H04N19/91 Entropy coding, e.g. variable length coding [VLC] or arithmetic coding

Abstract

Provided are a video encoding and decoding method and a video encoder and decoder capable of providing motion scalability. The video encoding method includes: estimating a base layer motion vector and an enhancement layer motion vector for each block in a video frame; removing temporal redundancy within the video frame using the enhancement layer motion vectors; spatially transforming the video frame from which the temporal redundancy has been removed and quantizing the spatially transformed frame to obtain texture information; selecting, for each block, one of the estimated base layer motion vector and the estimated enhancement layer motion vector; and generating a bitstream that includes the texture information and the motion vector selected for each block.
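As a rough illustration of the per-block selection described above, the following Python sketch chooses between a base layer and an enhancement layer motion vector by comparing a simple cost of the form distortion + λ·bits; the block structure, cost model and bit estimate are assumptions of the sketch, not details taken from the patent.

# Hedged sketch: per-block choice between base layer and enhancement layer
# motion vectors, using an assumed cost = SAD + lambda * motion vector bits.

def mv_bits(mv):
    # Crude estimate of the bits needed to code a motion vector (assumption).
    return sum(1 + abs(c).bit_length() for c in mv)

def select_motion_vectors(blocks, lam=4.0):
    # blocks: list of dicts with 'base_mv', 'enh_mv', 'base_sad', 'enh_sad'.
    selected = []
    for b in blocks:
        base_cost = b["base_sad"] + lam * mv_bits(b["base_mv"])
        enh_cost = b["enh_sad"] + lam * mv_bits(b["enh_mv"])
        if enh_cost <= base_cost:
            selected.append(("enhancement", b["enh_mv"]))
        else:
            selected.append(("base", b["base_mv"]))
    return selected

blocks = [
    {"base_mv": (1, 0), "enh_mv": (1, -1), "base_sad": 420, "enh_sad": 380},
    {"base_mv": (0, 0), "enh_mv": (3, 2),  "base_sad": 150, "enh_sad": 140},
]
print(select_motion_vectors(blocks))   # one block keeps the enhancement MV, one the base MV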

Description

Video encoding and decoding method, and video encoder and decoder
Technical field
Apparatuses and methods consistent with the present invention relate to video coding, and more particularly, to video coding that provides motion scalability.
Background art
With the development of information and communication technology, including the Internet, video communication is increasing rapidly alongside text and voice communication. Conventional text-based communication cannot satisfy the diverse demands of users, and multimedia services capable of delivering various types of information such as text, pictures and music are therefore increasingly being provided. Because the amount of multimedia data is usually large relative to other types of data, multimedia data requires high-capacity storage media and wide bandwidth for transmission. For example, a 24-bit true-color image with a resolution of 640×480 requires 640×480×24 bits per frame, i.e., about 7.37 Mbits of data. Transmitting such images at 30 frames per second requires a bandwidth of about 221 Mbit/s, and storing a 90-minute movie based on such images requires about 1,200 Gbits. A compression coding method is therefore essential for transmitting multimedia data that includes text, video and audio.
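The figures above follow directly from the raw frame size; a minimal sketch of the arithmetic:

# Raw bandwidth/storage for 640x480, 24-bit true-color video (no compression).
bits_per_frame = 640 * 480 * 24          # ~7.37 Mbit per frame
bitrate = bits_per_frame * 30            # 30 frames per second
movie_bits = bitrate * 90 * 60           # 90-minute movie

print(f"per frame : {bits_per_frame / 1e6:.2f} Mbit")
print(f"bit rate  : {bitrate / 1e6:.1f} Mbit/s")
print(f"90 minutes: {movie_bits / 1e9:.0f} Gbit")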
In such compression coding methods, the basic principle of data compression is to remove data redundancy. Data redundancy is typically defined as: (i) spatial redundancy, in which the same color or object is repeated within an image; (ii) temporal redundancy, in which there is little change between adjacent frames of a moving picture or the same sound is repeated in audio; or (iii) psychovisual redundancy, which takes into account the insensitivity of human vision and perception to high frequencies. Data can be compressed by removing such redundancy. Data compression is broadly divided into lossy/lossless compression according to whether source data is lost, intra-frame/inter-frame compression according to whether individual frames are compressed independently, and symmetric/asymmetric compression according to whether the time required for compression is the same as the time required for recovery. In addition, data compression is defined as real-time compression when the compression/recovery delay does not exceed 50 ms, and as scalable compression when frames have different resolutions. Lossless compression is usually used for text or medical data, while lossy compression is usually used for multimedia data.
Fig. 1 is the block diagram of conventional video encoder 100.
With reference to Fig. 1, conventional video encoder 100 comprises: exercise estimator 110 is used to estimate the motion between the frame of video; Motion compensator 120 is used to remove the time redundancy within the frame of video; Space transformer 130 is carried out spatial alternation to remove spatial redundancy; Quantizer 140 is used to quantize to remove the frame of spatial redundancy; Movable information encoder 160; With bit stream generator 150, be used to produce bit stream.
More particularly, exercise estimator 110 finds and will be used to remove by the motion of compensation present frame the motion vector of time redundancy.Motion vector is defined as best matching blocks from reference frame with respect to the displacement of the piece in the present frame, and this is described with reference to Fig. 2.Though original video frame can be used as reference frame, a lot of known video coding techniques are used by the original video frame decoding is obtained reconstruction frames as the reference frame.
Motion compensator 120 uses the motion vector that is calculated by exercise estimator 110 to remove the time redundancy that is present in the present frame.For this reason, motion compensator 120 uses reference frame and motion vector to produce predictive frame and present frame and predictive frame is compared, and produces residual frame thus.
Space transformer 130 spatial alternation residual frame are to obtain conversion coefficient.The most normally used spatial alternation algorithm is discrete cosine transform (DCT).Recently, wavelet transformation is adopted widely.
The conversion coefficient that quantizer 140 quantizes by space transformer 130 acquisitions.Quantify strength is determined according to bit rate.
The motion vector encoder that 160 pairs of movable information encoders are calculated by exercise estimator 110 to be reducing data volume, and produces the movable information that is included in the bit stream.
Bit stream generator 150 produces the bit stream of the motion vector that comprises quantized transform coefficients and coding.Although do not show in Fig. 1, in such as MPEG-2, MPEG-4 and conventional video encoding scheme H.264, quantized transform coefficients is not inserted directly in the bit stream.But the texture information of creating after scanning, flexible (scaling) and entropy coding (texture information) is comprised in the bit stream.
Fig. 2 represents traditional motion estimation process and the temporal mode that uses during estimation.
Motion estimation process uses block matching algorithm basically and is performed.Piece in the region of search in the mobile reference frame with present frame in piece relatively, and calculate the difference of these two pieces and be used for cost (cost) motion vector encoder.Make the piece in the minimized reference frame of cost be selected as the optimum Match reference block.Although full search guarantees the optimum performance in the estimation, the calculated load that this process need is too much.In current widely used video coding, search of three steps or layering variable size block coupling (HVSBM) are normally used for estimation.Three kinds of time frame inter modes that are used for estimation are arranged: forward, backward with two-way.The conventional video encoding scheme is used inter-frame forecast mode and the intra prediction mode that is used to from the information of present frame.
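As an illustration of the three-step search mentioned above, the following sketch matches one block against a reference frame using the sum of absolute differences (SAD) as the matching cost. The block size, step size and use of SAD alone (without a motion vector bit cost) are simplifying assumptions, not features taken from the patent.

```python
import numpy as np

def sad(cur_block, ref_frame, y, x):
    """Sum of absolute differences between the current block and the candidate
    block whose top-left corner is (y, x) in the reference frame."""
    h, w = cur_block.shape
    if y < 0 or x < 0 or y + h > ref_frame.shape[0] or x + w > ref_frame.shape[1]:
        return float("inf")                      # candidate falls outside the frame
    cand = ref_frame[y:y + h, x:x + w]
    return float(np.abs(cur_block.astype(np.int32) - cand.astype(np.int32)).sum())

def three_step_search(cur_block, ref_frame, top, left, step=4):
    """Classic three-step search: test the centre and its eight neighbours at
    the current step size, re-centre on the best match, halve the step, repeat."""
    center = (0, 0)
    best_cost = sad(cur_block, ref_frame, top, left)
    while step >= 1:
        best = center
        for dy in (-step, 0, step):
            for dx in (-step, 0, step):
                mv = (center[0] + dy, center[1] + dx)
                cost = sad(cur_block, ref_frame, top + mv[0], left + mv[1])
                if cost < best_cost:
                    best_cost, best = cost, mv
        center = best
        step //= 2
    return center    # motion vector (dy, dx) of the best matching reference block
```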
A scalable video coding scheme that removes temporal redundancy using motion compensation provides high video compression efficiency at sufficient bit rates. However, because such a conventional scheme keeps the number of bits allocated to the motion information contained in the bitstream produced by video coding constant while reducing the number of bits allocated to the texture information contained in the bitstream, it has low compression efficiency at low bit rates. When a conventional video coding scheme is performed at a very low bit rate, the resulting bitstream may contain very little texture information or, in an extreme case, only motion information. Conventional video coding, in which it is difficult to reduce the motion information, therefore suffers a significant degradation in video quality at low bit rates. Accordingly, an algorithm designed to adjust the number of bits allocated to the motion information in the bitstream is needed.
Summary of the invention
The present invention provides a video encoding and decoding method and a video encoder and decoder capable of adjusting the amount of bits allocated to motion information.
According to an aspect of the present invention, there is provided a video encoding method comprising: estimating a base layer motion vector and an enhancement layer motion vector for each block in a video frame; removing temporal redundancy from the video frame using the enhancement layer motion vectors; spatially transforming the video frame from which the temporal redundancy has been removed and quantizing the spatially transformed video frame to obtain texture information; selecting, for each block, one of the estimated base layer motion vector and the estimated enhancement layer motion vector; and generating a bitstream containing the motion vector selected for each block and the texture information.
According to another aspect of the present invention, there is provided a video encoding method comprising: estimating a base layer motion vector and an enhancement layer motion vector for each block in a video frame; removing temporal redundancy from the video frame using the enhancement layer motion vectors; spatially transforming the video frame from which the temporal redundancy has been removed and quantizing the spatially transformed video frame to obtain texture information; and generating a bitstream containing, for each block, the estimated base layer motion vector, a residual motion vector that is the difference between the estimated base layer motion vector and the estimated enhancement layer motion vector, and the texture information.
According to another aspect of the present invention, there is provided a video encoder comprising: a motion estimator, which estimates a base layer motion vector and an enhancement layer motion vector for each block in a video frame; a motion compensator, which removes temporal redundancy from the video frame using the enhancement layer motion vectors; a spatial transformer, which spatially transforms the video frame from which the temporal redundancy has been removed; a quantizer, which quantizes the spatially transformed video frame to obtain texture information; a motion vector selector, which selects, for each block, one of the estimated base layer motion vector and the estimated enhancement layer motion vector; and a bitstream generator, which generates a bitstream containing the motion vector selected for each block and the texture information.
According to another aspect of the present invention, there is provided a video encoder comprising: a motion estimator, which estimates a base layer motion vector and an enhancement layer motion vector for each block in a video frame; a motion compensator, which removes temporal redundancy from the video frame using the enhancement layer motion vectors; a spatial transformer, which spatially transforms the video frame from which the temporal redundancy has been removed; a quantizer, which quantizes the spatially transformed video frame to obtain texture information; and a bitstream generator, which generates a bitstream containing, for each block, the estimated base layer motion vector, a residual motion vector that is the difference between the estimated base layer motion vector and the estimated enhancement layer motion vector, and the texture information.
According to another aspect of the present invention, there is provided a predecoding method comprising: receiving a bitstream that contains, for each block, a base layer motion vector, a residual motion vector that is the difference between the base layer motion vector and an enhancement layer motion vector, and texture information obtained by encoding a video frame; and truncating at least a portion of the residual motion vectors.
According to another aspect of the present invention, there is provided a video decoding method comprising: interpreting an input bitstream to obtain texture information and motion information containing base layer motion vectors and enhancement layer motion vectors; refining the base layer motion vectors; performing inverse quantization and an inverse spatial transform on the texture information to obtain frames from which temporal redundancy has been removed; and performing inverse motion compensation on the frames from which the temporal redundancy has been removed, using the refined base layer motion vectors and the enhancement layer motion vectors.
According to another aspect of the present invention, there is provided a video decoding method comprising: interpreting an input bitstream to obtain texture information and motion information containing base layer motion vectors and residual motion vectors; for each block having both a base layer motion vector and a residual motion vector, synthesizing the base layer motion vector and the residual motion vector to obtain a synthesized motion vector; performing inverse quantization and an inverse spatial transform on the texture information to obtain frames from which temporal redundancy has been removed; and performing inverse motion compensation on the frames from which the temporal redundancy has been removed, using the synthesized motion vectors and the base layer motion vectors that were not synthesized.
According to another aspect of the present invention, there is provided a video decoder comprising: a bitstream interpreter, which interprets an input bitstream and obtains texture information and motion information containing base layer motion vectors and enhancement layer motion vectors; a motion vector refiner, which refines the base layer motion vectors; an inverse quantizer, which performs inverse quantization on the texture information; an inverse spatial transformer, which performs an inverse spatial transform on the inversely quantized texture information to obtain frames from which temporal redundancy has been removed; and an inverse motion compensator, which performs inverse motion compensation on the frames from which the temporal redundancy has been removed, using the refined base layer motion vectors and the enhancement layer motion vectors, and reconstructs the video frames.
According to another aspect of the present invention, there is provided a video decoder comprising: a bitstream interpreter, which interprets an input bitstream and obtains texture information and motion information containing base layer motion vectors and residual motion vectors; a motion vector synthesizer, which, for each block having both a base layer motion vector and a residual motion vector, synthesizes the base layer motion vector and the residual motion vector to obtain a synthesized motion vector; an inverse quantizer, which performs inverse quantization on the texture information; an inverse spatial transformer, which performs an inverse spatial transform on the inversely quantized texture information and obtains frames from which temporal redundancy has been removed; and an inverse motion compensator, which performs inverse motion compensation on the frames from which the temporal redundancy has been removed, using the synthesized motion vectors and the base layer motion vectors that were not synthesized.
Description of drawings
The above and other aspects of the present invention will become more apparent from the following detailed description of exemplary embodiments thereof, taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a block diagram of a conventional video encoder;
Fig. 2 illustrates a conventional motion estimation process and temporal modes;
Fig. 3 is a block diagram of a video encoder according to a first exemplary embodiment of the present invention;
Figs. 4 and 5 are block diagrams of video encoders according to second and third exemplary embodiments of the present invention, respectively;
Fig. 6 illustrates a motion estimation process according to an exemplary embodiment of the present invention;
Fig. 7 illustrates block modes according to an exemplary embodiment of the present invention;
Fig. 8 illustrates examples of frames having different enhancement layer percentages according to an exemplary embodiment of the present invention;
Fig. 9 is a block diagram of a video decoder according to a first exemplary embodiment of the present invention;
Figs. 10 and 11 are block diagrams of video decoders according to second and third exemplary embodiments of the present invention, respectively;
Fig. 12 illustrates a video service environment according to an exemplary embodiment of the present invention;
Fig. 13 illustrates the structure of a bitstream according to an exemplary embodiment of the present invention; and
Fig. 14 is a graph showing the variation in video quality when enhancement layer motion vectors and base layer motion vectors are used.
Embodiment
The present invention and methods of accomplishing the same may be understood more readily by reference to the following detailed description of exemplary embodiments and the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as being limited to the exemplary embodiments set forth herein. Rather, these exemplary embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concept of the invention to those skilled in the art, and the present invention will only be defined by the appended claims. Like reference numerals refer to like elements throughout the specification.
The present invention will now be described more fully with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown.
The present invention provides a video coding scheme designed to adjust the amount of bits allocated to motion vectors (motion information), and it can be applied both to open-loop video coding, which uses original video frames as reference frames, and to closed-loop video coding, which uses reconstructed frames as reference frames. Unlike an open-loop video encoder, a closed-loop video encoder uses as reference frames the reconstructed frames obtained by subjecting the quantized transform coefficients to inverse quantization, an inverse transform and motion compensation, and therefore includes several components used for video decoding, such as an inverse quantizer and an inverse spatial transformer. Although the present invention is described with reference to exemplary embodiments of open-loop scalable video coding, closed-loop video coding may also be used.
Fig. 3 is a block diagram of a video encoder 300 according to a first exemplary embodiment of the present invention.
Referring to Fig. 3, the video encoder 300 according to the first exemplary embodiment of the present invention includes a motion estimator 310, a motion compensator 320, a spatial transformer 330, a quantizer 340, a bitstream generator 350, a motion vector selector 360 and a motion information encoder 370.
The motion estimator 310 estimates the motion between each block in the current frame and the corresponding block in one reference frame or in two reference frames. The displacement between the position of each block in the current frame and the position of the corresponding block in the reference frame is defined as a motion vector.
Because the motion estimation process of finding motion vectors requires a large amount of computation, various techniques have been studied to reduce the amount of computation needed for motion estimation. The three-step search and the two-dimensional (2D) logarithmic search were designed to reduce the amount of computation by reducing the number of search points for each motion vector estimation. Adaptive/predictive search is a method in which the motion vector of a block in the current frame is predicted from the motion vector of a block in the previous frame in order to reduce the computation required for motion estimation. HVSBM is an algorithm in which a frame with the original resolution is down-sampled to obtain lower-resolution frames, and the motion vector found at the lowest resolution is used to find motion vectors at progressively higher resolutions. Another way to reduce the computation required for motion estimation is to replace the function used to compute the block matching cost with a simpler function.
The motion estimator 310 in this exemplary embodiment performs a process of finding base layer motion vectors and a process of finding enhancement layer motion vectors. That is, the motion estimator 310 finds a base layer motion vector and then refines the base layer motion vector to find an enhancement layer motion vector. The processes of finding the base layer motion vectors and the enhancement layer motion vectors may be performed by various motion estimation algorithms. In this exemplary embodiment, HVSBM is used to find the base layer motion vectors, or both the base layer and enhancement layer motion vectors, because motion vectors obtained using HVSBM tend to be consistent with the motion vectors of adjacent blocks. In addition, an enhancement layer motion vector is found within a search area smaller than the search area in which the base layer motion vector was obtained. In other words, the enhancement layer motion vector is obtained by refining the already estimated base layer motion vector.
The motion compensator 320 obtains order information by separately performing motion compensation using the base layer motion vectors (hereinafter referred to as "base layer motion compensation") and motion compensation using the enhancement layer motion vectors (hereinafter referred to as "enhancement layer motion compensation"). The motion compensator 320 then provides the frames from which temporal redundancy has been removed by enhancement layer motion compensation to the spatial transformer 330.
In scalable video coding, various algorithms such as motion compensated temporal filtering (MCTF) are used to remove temporal redundancy. Although the Haar filter has traditionally been used in MCTF, the 5/3 filter has recently been widely used. MCTF is performed on a group-of-pictures (GOP) basis and includes: generating predicted frames using the results of motion estimation; obtaining residual frames (high-pass subbands) as the differences between the current frames and the predicted frames; and updating the remaining original frames, or low-pass subbands, using the residual frames. As a result of performing this process repeatedly, temporal redundancy is removed from the frames making up a GOP, producing one low-pass subband and a plurality of high-pass subbands.
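For illustration only, the following sketch shows the predict/update lifting structure of MCTF on a GOP, reduced to the Haar filter with zero motion (i.e., without the motion-compensated prediction an actual encoder would apply); frame arrays are assumed to be floating-point.

```python
import numpy as np

def mctf_level(frames):
    """One temporal decomposition level: the predict step produces high-pass
    (residual) frames, the update step produces low-pass frames."""
    lows, highs = [], []
    for even, odd in zip(frames[0::2], frames[1::2]):
        high = odd - even           # predict: residual (high-pass) frame
        low = even + high / 2.0     # update: low-pass frame
        lows.append(low)
        highs.append(high)
    return lows, highs

def mctf_gop(gop, levels=3):
    """Repeat the decomposition on the low-pass frames; a GOP of 2**levels
    frames yields one low-pass subband and several groups of high-pass subbands."""
    lows, all_highs = list(gop), []
    for _ in range(levels):
        lows, highs = mctf_level(lows)
        all_highs.append(highs)
    return lows[0], all_highs

# Example: 8 frames of 64x64 noise -> 1 low-pass frame and 3 groups of high-pass frames.
gop = [np.random.rand(64, 64) for _ in range(8)]
low_pass, high_pass_groups = mctf_gop(gop)
```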
The spatial transformer 330 removes the spatial redundancy from the frames from which the temporal redundancy has been removed using a spatial transform, and creates transform coefficients. The spatial transform is performed using the DCT or the wavelet transform. The video encoder 300 may use the wavelet transform to produce a bitstream with spatial scalability. Alternatively, a video encoder 300 having a plurality of layers with different resolutions may use the DCT to remove the spatial redundancy from the frames from which the temporal redundancy has been removed, and thereby produce a bitstream with spatial scalability.
The quantizer 340 quantizes the transform coefficients in a manner that minimizes distortion at a given bit rate. Quantization for scalable video coding is performed using known embedded quantization algorithms such as embedded zerotree wavelet (EZW), set partitioning in hierarchical trees (SPIHT), embedded zero block coding (EZBC), and embedded block coding with optimized truncation (EBCOT). The quantized transform coefficients (texture information) are inserted into the bitstream after scanning, scaling and variable length coding. The bitstream contains both texture information and motion information. To insert the motion information into the bitstream, the video encoder 300 includes the motion vector selector 360 and the motion information encoder 370.
The motion vector selector 360 selects either the base layer motion vector or the enhancement layer motion vector for each block. More specifically, enhancement layer motion vectors are selected in order from the block with the largest difference in visual quality to the block with the smallest difference in visual quality, where the visual quality is that obtained when temporal redundancy is removed using base layer motion compensation and enhancement layer motion compensation, respectively. For example, when the degree of quality improvement decreases in the order of blocks 1, 2, 3, 4, 5, 6, 7 and 8 and enhancement layer motion compensation can be afforded for only three blocks, the motion vector selector 360 selects enhancement layer motion vectors for blocks 1 to 3 and base layer motion vectors for blocks 4 to 8. The selected motion information (base layer motion vectors and enhancement layer motion vectors) is provided to the motion information encoder 370. Accordingly, the texture information included in the bitstream is the quantized transform coefficients obtained by enhancement layer motion compensation, the spatial transform and quantization, while the motion information included in the bitstream consists of the enhancement layer motion vectors of blocks 1 to 3 and the base layer motion vectors of blocks 4 to 8.
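A minimal sketch of this per-block selection, assuming the order information (block indices sorted by decreasing quality improvement) and the two motion vector sets are already available; all names are illustrative.

```python
def select_motion_vectors(order, base_mvs, enh_mvs, enh_fraction):
    """Give enhancement layer MVs to the blocks that benefit most and base
    layer MVs to the rest.  `enh_fraction` is the enhancement layer percentage."""
    n_enh = int(round(enh_fraction * len(order)))
    selected = {}
    for rank, block in enumerate(order):
        selected[block] = enh_mvs[block] if rank < n_enh else base_mvs[block]
    return selected

# With the ordering 1..8 used in the example above and room for three
# enhancement layer MVs: blocks 1-3 get enhancement MVs, blocks 4-8 base MVs.
# select_motion_vectors([1, 2, 3, 4, 5, 6, 7, 8], base_mvs, enh_mvs, 3 / 8)
```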
The motion vector selector 360 receives information about the order in which the degree of video quality improvement decreases (hereinafter referred to as "order information") from the motion compensator 320. The percentage of enhancement layer motion vectors selected by the motion vector selector 360 may be determined by manual user input or automatically according to the bit rate. When the motion information is assembled according to the bit rate, the motion vector selector 360 selects a high percentage of enhancement layer motion vectors for a high bit rate and a low percentage of enhancement layer motion vectors for a low bit rate.
The motion information encoder 370 encodes the motion information using arithmetic coding or variable length coding. The encoded motion information is inserted into the bitstream. The coding efficiency of the motion information is high when the motion vectors contained in the motion information are consistent. In this exemplary embodiment, the motion estimator 310 estimates the motion vectors (base layer motion vectors and enhancement layer motion vectors) using the HVSBM algorithm in order to obtain consistent motion vectors.
The bitstream generator 350 generates a bitstream containing the texture information and the encoded motion information. Although the motion vector of each block included in the encoded motion information is described above as a base layer motion vector or an enhancement layer motion vector, the motion vector may instead consist of the base layer motion vector and the residual motion vector needed to obtain the enhancement layer motion vector, rather than the enhancement layer motion vector itself. The same applies to the video encoder shown in Fig. 4.
Fig. 4 is a block diagram of a video encoder 400 according to a second exemplary embodiment of the present invention.
Referring to Fig. 4, the motion estimator 410, motion compensator 420, spatial transformer 430 and quantizer 440 in the video encoder 400 have substantially the same functions as their counterparts in the video encoder 300 of Fig. 3.
However, the motion vector selector 460, motion information encoder 470 and bitstream generator 450 operate in a slightly different manner from their counterparts in the video encoder 300 of Fig. 3.
The motion vector selector 460 produces a plurality of types of motion data, each having a different percentage of base layer motion vectors and enhancement layer motion vectors. For example, the motion vector selector 460 may produce a total of six types of motion data. The first type contains the enhancement layer motion vectors of all blocks. The second type contains 80 percent enhancement layer motion vectors and 20 percent base layer motion vectors. The third type contains 60 percent enhancement layer motion vectors and 40 percent base layer motion vectors. The fourth type contains 40 percent enhancement layer motion vectors and 60 percent base layer motion vectors. The fifth type contains 20 percent enhancement layer motion vectors and 80 percent base layer motion vectors. The sixth type contains the base layer motion vectors of all blocks. All six types of motion data are inserted into the bitstream. Meanwhile, a video decoder receives the bitstream predecoded by a predecoder 480 and reconstructs the video frames using one type of motion data.
As the number of types of motion data created by the motion vector selector 460 increases, the size of the bitstream increases but so does its motion scalability. Conversely, as the number of types of motion data decreases, the size of the bitstream decreases along with its motion scalability. Each type of motion data may contain a percentage of enhancement layer motion vectors different from those in the above example. For example, when the motion vector selector 460 produces six types of motion data, the percentages of enhancement layer motion vectors contained in the six types may be 100, 70, 40, 20, 10 and 0, respectively.
The motion information encoder 470 encodes the plurality of types of motion data using arithmetic coding or variable length coding to reduce the amount of data.
The bitstream generator 450 generates a bitstream containing the texture information and the encoded motion data.
The predecoder 480 truncates the encoded motion data except for one type of motion data to be transmitted to the decoder. For example, when the bandwidth available for transmitting the bitstream to the decoder is small, the predecoder 480 truncates the encoded motion data except for the type containing the lowest percentage of enhancement layer motion vectors (for example, 0%). Conversely, when the bandwidth for transmitting the bitstream to the decoder is very wide, the predecoder 480 truncates the encoded motion data except for the type containing the highest percentage of enhancement layer motion vectors (for example, 100%). In the same manner, the predecoder 480 truncates the encoded motion data except for the one type appropriately selected according to the bit rate.
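The predecoder's choice can be pictured as a simple lookup; the sketch below assumes the encoded size of each motion data type is known and uses an arbitrary rule of thumb (at most a quarter of the channel budget for motion) purely for illustration.

```python
def pick_motion_data_type(layer_sizes_kbit, available_kbps, motion_share=0.25):
    """layer_sizes_kbit maps enhancement-MV percentage -> encoded size of that
    motion data type in kbit; return the richest type that fits the budget."""
    budget = available_kbps * motion_share
    for pct in sorted(layer_sizes_kbit, reverse=True):   # richest type first
        if layer_sizes_kbit[pct] <= budget:
            return pct
    return min(layer_sizes_kbit)                         # base-layer-only fallback

# e.g. pick_motion_data_type({100: 300, 80: 260, 60: 220, 40: 180, 20: 140, 0: 100}, 1000)
# -> 60 (the 60% enhancement type is the richest one within a 250 kbit motion budget)
```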
Fig. 5 is a block diagram of a video encoder 500 according to a third exemplary embodiment of the present invention.
Referring to Fig. 5, the motion estimator 510, motion compensator 520, spatial transformer 530, quantizer 540, bitstream generator 550 and motion information encoder 570 in the video encoder 500 have substantially the same functions as their counterparts in the video encoder 300 of Fig. 3.
Unlike the video encoder 300 of Fig. 3, the video encoder 500 does not include a motion vector selector. Accordingly, the motion information encoder 570 encodes information containing both the base layer motion vector and the enhancement layer motion vector of each block. The encoded motion information (base layer motion vectors and enhancement layer motion vectors) is inserted into the bitstream.
The bitstream generator 550 generates a bitstream containing the texture information, the encoded motion information and the order information.
The predecoder 580 truncates the encoded motion information starting from the enhancement layer motion vectors of the blocks showing the smallest quality improvement. For example, when the bit rate is very low, the predecoder 580 truncates all of the encoded enhancement layer motion vectors, whereas when the bit rate is sufficient, the predecoder 580 retains the enhancement layer motion vectors.
Fig. 6 illustrates a motion estimation process according to an exemplary embodiment of the present invention.
Fig. 6 shows a base layer motion vector, an enhancement layer motion vector and a residual motion vector. The base layer motion vector and the enhancement layer motion vector are obtained from a base layer motion search and an enhancement layer motion search, respectively, and the residual motion vector is the difference between the enhancement layer motion vector and the base layer motion vector.
Block 610 is a block in the current frame, block 620 is the block corresponding to block 610, and block 630 is the block found by the base layer motion search. In the conventional motion estimation process, the block 620 corresponding to block 610 is found directly. In an exemplary embodiment of the present invention, however, block 630 is first found using the base layer motion search, and block 620 is then found using the enhancement layer motion search. The block matching scheme used in exemplary embodiments of the present invention will now be described.
The block at the position that minimizes the cost of encoding a block in the current frame is determined to be the block corresponding to that block in the current frame. When E(k,l) and B(k,l) respectively denote the bits allocated to the texture and to the motion vector when the k-th block in the current frame is encoded using the l-th block in the search area of the reference frame, the cost C(k,l) is defined by Equation (1):
C(k,l) = E(k,l) + λB(k,l)    ...(1)
Here, λ is a Lagrangian coefficient used to control the balance between the bits allocated to the motion vectors and the bits allocated to the texture. When λ increases, the number of bits allocated to the texture increases; when λ decreases, the number of bits allocated to the motion vectors increases. When enough bits cannot be allocated to the motion vectors at a very low bit rate, λ is made very large so that bits are mainly allocated to the texture.
To obtain the base layer motion vector, the value of l that minimizes the cost C(k,l) is found, and the displacement between the block 630 in the reference frame corresponding to that value of l and the block 610 in the current frame is calculated. After the base layer motion vector is determined in this way, Equation (1) is used to find block 620 within the enhancement layer search area. The enhancement layer search area may be significantly narrower than the base layer search area so that the difference between the base layer motion vector and the enhancement layer motion vector is minimized. Similarly, the block 620 that minimizes the cost is found, and the displacement between block 620 and the block 630 found by the base layer motion search determines the enhancement layer motion vector. The base layer motion search uses a larger λ than the enhancement layer motion search so that only a small number of bits are allocated to the base layer motion vectors. Therefore, at very low bit rates, the texture and the base layer motion information are included in the bitstream so that the number of bits allocated to the motion vectors is minimized.
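The two-stage search described above can be sketched as follows. SAD stands in for the texture bits E(k,l) and a crude magnitude-based estimate stands in for the motion bits B(k,l); the search radii and λ values are illustrative assumptions only.

```python
import numpy as np

def lagrangian_search(cur, ref, top, left, center, radius, lam):
    """Minimize C(k,l) = E(k,l) + lambda*B(k,l) over a square window around `center`."""
    h, w = cur.shape
    best_cost, best_mv = float("inf"), center
    for dy in range(center[0] - radius, center[0] + radius + 1):
        for dx in range(center[1] - radius, center[1] + radius + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > ref.shape[0] or x + w > ref.shape[1]:
                continue
            sad = np.abs(cur.astype(np.int32) - ref[y:y + h, x:x + w].astype(np.int32)).sum()
            mv_bits = 2 * (abs(dy) + abs(dx)) + 2      # rough proxy for B(k,l)
            cost = sad + lam * mv_bits
            if cost < best_cost:
                best_cost, best_mv = cost, (dy, dx)
    return best_mv

def two_layer_search(cur, ref, top, left,
                     base_radius=16, enh_radius=2, lam_base=64.0, lam_enh=4.0):
    """Base layer search with a large lambda, then enhancement refinement in a
    much smaller window with a small lambda."""
    base_mv = lagrangian_search(cur, ref, top, left, (0, 0), base_radius, lam_base)
    enh_mv = lagrangian_search(cur, ref, top, left, base_mv, enh_radius, lam_enh)
    residual_mv = (enh_mv[0] - base_mv[0], enh_mv[1] - base_mv[1])
    return base_mv, enh_mv, residual_mv
```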
The base layer motion search and the enhancement layer motion search may be performed using HVSBM. HVSBM provides a consistent motion vector field and thus reduces the overall bit rate of the motion vectors. In addition, HVSBM requires little computational effort, and motion scalability is achieved by confining the enhancement layer search area to a small region. Actual experimental results show that the peak signal-to-noise ratio (PSNR) is almost constant regardless of the size of the enhancement layer search area.
The bitstream produced by the video encoder 300 of Fig. 3 contains a single type of motion data, which contains either the base layer motion vector or the enhancement layer motion vector of each block. The bitstream produced by the video encoder 400 of Fig. 4 contains a plurality of types of motion data, each containing either the base layer motion vector or the enhancement layer motion vector of each block; the types of motion data have different percentages of enhancement layer motion vectors. This bitstream is therefore predecoded and truncated, with all but a specific type of motion data removed, before being transmitted to the video decoder. The bitstream produced by the video encoder 500 of Fig. 5 contains a single type of motion data, which contains both the base layer motion vector and the residual motion vector of each block. This bitstream is therefore predecoded according to the bit rate so that only the base layer motion vectors of some blocks, and both the base layer motion vectors and the residual motion vectors of the remaining blocks, are transmitted to the video decoder.
The video encoder 300 of Fig. 3 may include a motion vector synthesizer in place of the motion vector selector 360, which synthesizes base layer motion vectors and residual motion vectors. In this case, base layer motion vectors and residual motion vectors are provided to the motion vector synthesizer, and motion information containing base layer motion vectors and enhancement layer motion vectors is provided to the motion information encoder 370. Each enhancement layer motion vector is obtained by synthesizing a base layer motion vector with a residual motion vector. The video encoder 400 of Fig. 4 may likewise include a motion vector synthesizer in place of the motion vector selector 460.
Meanwhile, although the bitstream produced by the video encoder 500 of Fig. 5 is described above as containing both the base layer motion vector and the residual motion vector of each block, the enhancement layer motion vector may be inserted into the bitstream in place of the residual motion vector. In this case, the predecoder 580 selectively truncates the base layer motion vector or the enhancement layer motion vector of each block according to the bit rate and the order information.
Fig. 7 illustrates block modes according to an exemplary embodiment of the present invention. Referring to Fig. 7, introducing the concept of block modes reinforces the motion scalability achieved, as described above, by using a small enhancement layer search area.
In mode 0, motion vector search is performed on a 16×16 block basis. In modes 1, 2, 3 and 4, motion vector search is performed on 8×16, 16×8, 8×8 and 4×4 sub-blocks, respectively.
In this exemplary embodiment, the base layer block mode is one of mode 0, mode 1, mode 2 and mode 3, and the enhancement layer block mode is one of mode 0, mode 1, mode 2, mode 3 and mode 4. When the base layer block mode is mode 0, the enhancement layer block mode is selected from among mode 0, mode 1, mode 2, mode 3 and mode 4. When the base layer block mode is mode 1, the enhancement layer block mode is selected from among mode 1, mode 3 and mode 4. When the base layer block mode is mode 2 or mode 3, the enhancement layer block mode is selected from among mode 2, mode 3 and mode 4, or from among mode 3 and mode 4, respectively. When the base layer block mode is mode 1, the enhancement layer block mode cannot be mode 2, because mode 1 and mode 2 are a horizontal mode and a vertical mode, respectively.
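The mode refinement rule just described can be captured in a small lookup table; a sketch (mode numbers follow Fig. 7):

```python
# Enhancement layer block modes allowed for each base layer block mode.
ALLOWED_ENH_MODES = {
    0: (0, 1, 2, 3, 4),   # a 16x16 base block may be refined in any way
    1: (1, 3, 4),         # a mode-1 base block may not switch to mode 2,
    2: (2, 3, 4),         # and vice versa
    3: (3, 4),
}

def is_valid_refinement(base_mode, enh_mode):
    return enh_mode in ALLOWED_ENH_MODES[base_mode]
```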
Because the base layer motion search uses a larger λ than the enhancement layer motion search, as described above, allocating to a motion vector estimated during the base layer motion search (a base layer motion vector) even the same number of bits as is allocated to a motion vector estimated during the enhancement layer motion search incurs a larger penalty in the base layer. Therefore, in actual experiments, mode 0 is determined as the base layer block mode except in special cases. On the other hand, because the enhancement layer uses a small λ, the penalty on the number of bits allocated to motion vectors is smaller than in the base layer. For this reason, the enhancement layer block mode usually has more finely subdivided blocks. Although Fig. 7 shows five block modes, the number of available block modes may be greater or less than five.
According to exemplary embodiments of the present invention, the texture information included in the bitstream is obtained by spatially transforming and quantizing the frames from which temporal redundancy has been removed using the enhancement layer motion vectors. Therefore, when the motion vectors of some blocks are base layer motion vectors at a low bit rate, a motion mismatch may occur. The motion mismatch arises because enhancement layer motion vectors are used during encoding while base layer motion vectors are used during decoding, and it causes a degradation in coding performance (for example, visual quality, compression efficiency, etc.).
Therefore, in order to minimize the motion mismatch, the present invention proposes an algorithm for determining whether the enhancement layer motion vector or the base layer motion vector is used for each block. The degree of mismatch E caused by using the base layer motion vectors at the decoder is given by Equation (2):
E = Σ|O_m − O_b|    ...(2)
where O_m and O_b are the frames reconstructed using the enhancement layer motion vectors and the base layer motion vectors, respectively. O_m and O_b are defined by Equation (3):
O_m = P_m + H_m
O_b = P_b + H_m    ...(3)
where P_m and H_m are the predicted frame and the residual frame obtained using the enhancement layer motion vectors, respectively, and P_b is the frame predicted using the base layer motion vectors.
Assuming that there is no quantization loss in the video coding, O_m can also be written as Equation (4):
O_m = P_b + H_b    ...(4)
where H_b is the residual frame obtained using the base layer motion vectors.
Substituting Equations (3) and (4) into Equation (2) and rearranging yields Equation (5):
E = Σ|O_m − O_b| = Σ|P_m − P_b| = Σ|H_m − H_b|    ...(5)
As defined in Equation (5), the degree of mismatch E is determined by the difference between the frames predicted using the enhancement layer motion vectors and the base layer motion vectors, or by the difference between the residual frames obtained using the enhancement layer motion vectors and the base layer motion vectors.
Referring to Figs. 3 to 5, the predicted frames and residual frames are obtained by the motion compensator 320, 420 or 520. That is, the motion compensator 320, 420 or 520 receives the base layer motion vectors and the enhancement layer motion vectors from the motion estimator 310, 410 or 510 and produces the predicted frames P_m and P_b and the residual frames H_m and H_b.
In the present invention, Equation (5) can be used to determine the importance of each block. That is, the difference between encoding each block using enhancement layer motion compensation and encoding it using base layer motion compensation is calculated, and the order of importance of the blocks is determined according to the magnitude of this difference. For example, the order of importance may be determined from the difference between the residual blocks obtained using enhancement layer motion compensation and base layer motion compensation (the difference between a block in the current frame and the corresponding block in the predicted frame). That is, when the difference between the residual blocks is larger, the difference between encoding the block using enhancement layer motion compensation and encoding it using base layer motion compensation is also considered to be larger. The order of importance of the blocks may be calculated by the motion vector selector rather than the motion estimator.
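Following Equation (5), the order information can be derived directly from the two residual frames; the sketch below ranks 16×16 blocks by Σ|H_m − H_b| (the block size is an assumption).

```python
import numpy as np

def block_importance_order(residual_enh, residual_base, block_size=16):
    """Return block positions sorted by the mismatch measure of Equation (5),
    E = sum |H_m - H_b| over the block, most important (largest E) first."""
    h, w = residual_enh.shape
    scores = []
    for top in range(0, h, block_size):
        for left in range(0, w, block_size):
            hm = residual_enh[top:top + block_size, left:left + block_size]
            hb = residual_base[top:top + block_size, left:left + block_size]
            scores.append(((top, left), float(np.abs(hm - hb).sum())))
    scores.sort(key=lambda item: item[1], reverse=True)
    return [pos for pos, _ in scores]      # this ordering is the order information
```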
The motion vector selector 360 of Fig. 3 or the motion vector selector 460 of Fig. 4 selects enhancement layer motion vectors in order of importance. That is, enhancement layer motion vectors are preferentially allocated to blocks with a large mismatch. Meanwhile, the bitstream produced by the video encoder of Fig. 5, which does not include a motion vector selector, contains the base layer motion vectors and residual motion vectors of all blocks together with the order information. Using the order information, the predecoder 580 truncates the motion information as needed according to the bit rate, starting from the residual motion vectors with the lowest importance.
Fig. 8 shows examples of frames in which the percentage of the enhancement layer is 0% and 50%, respectively.
Referring to Fig. 8, when the percentage of the enhancement layer is 0%, all of the texture information is produced using enhancement layer motion compensation, but all blocks undergo inverse base layer motion compensation during video decoding. When the percentage of the enhancement layer is 50%, all of the texture information is produced using enhancement layer motion compensation, 50% of the blocks undergo inverse base layer motion compensation, and the remaining 50% of the blocks undergo inverse enhancement layer motion compensation.
The block mode number is indicated in each block. As shown in Fig. 8, the base layer block mode and the enhancement layer block mode may differ for the same block. When the base layer block mode differs from the enhancement layer block mode, the enhancement layer block mode is used for blocks that undergo inverse enhancement layer motion compensation during decoding, and the base layer block mode is used for blocks that undergo inverse base layer motion compensation.
Video decoders that reconstruct video frames encoded using MCTF-based scalable video coding will now be described with reference to Figs. 9 to 11. Fig. 9 shows a video decoder 900 for decoding the bitstream produced by the video encoder 300 of Fig. 3 or the predecoded bitstream produced by the predecoder 480 shown in Fig. 4. Figs. 10 and 11 show video decoders for decoding the predecoded bitstream produced by the predecoder 580 shown in Fig. 5.
Fig. 9 is a block diagram of a video decoder 900 according to a first exemplary embodiment of the present invention.
Referring to Fig. 9, the video decoder 900 includes a bitstream interpreter 910, an inverse quantizer 920, an inverse spatial transformer 930, an inverse motion compensator 940, a motion information decoder 950 and a motion vector refiner 960.
The bitstream interpreter 910 obtains the texture information and the encoded motion information from the input bitstream. The texture information, which contains the image data of the encoded video frames, is provided to the inverse quantizer 920, and the encoded motion information, which contains the base layer motion vector or the enhancement layer motion vector of each block, is provided to the motion information decoder 950.
The inverse quantizer 920 inversely quantizes the texture information to obtain transform coefficients. The obtained transform coefficients are sent to the inverse spatial transformer 930.
The inverse spatial transformer 930 performs an inverse spatial transform on the transform coefficients to obtain a single low-pass subband and a plurality of high-pass subbands for each GOP.
The inverse motion compensator 940 receives the low-pass subband and the plurality of high-pass subbands of each GOP, updates the low-pass subband using one or more of the high-pass subbands, and produces a predicted frame using the updated low-pass subband. The inverse motion compensator 940 then adds the predicted frame to a high-pass subband, thereby reconstructing a low-pass subband. The inverse motion compensator 940 updates the updated low-pass subband and the reconstructed low-pass subband once again, produces two predicted frames using the updated low-pass subbands, and reconstructs two low-pass subbands by adding the two predicted frames to two high-pass subbands, respectively. The inverse motion compensator 940 repeats this process to reconstruct the video frames making up the GOP. The motion vectors used during the update and predicted-frame generation operations are obtained from the motion information (the base layer motion vector or the enhancement layer motion vector of each block) produced by the motion information decoder 950 by decoding the encoded motion information. The resulting motion information contains base layer motion vectors and enhancement layer motion vectors. The base layer motion vectors are provided to the motion vector refiner 960, which then refines them using the enhancement layer motion vectors of adjacent blocks. Alternatively, the motion vector refiner 960 may refine the base layer motion vectors using the predicted frames produced during inverse motion compensation as a reference. The enhancement layer motion vectors and the refined base layer motion vectors are provided to the inverse motion compensator 940 to be used for the update and predicted-frame generation operations.
Fig. 10 is a block diagram of a video decoder 1000 according to a second exemplary embodiment of the present invention.
Referring to Fig. 10, the video decoder 1000 includes a bitstream interpreter 1010, an inverse quantizer 1020, an inverse spatial transformer 1030, an inverse motion compensator 1040, a motion information decoder 1050 and a motion vector synthesizer 1070.
The bitstream interpreter 1010 obtains the texture information and the encoded motion information from the input bitstream. The texture information, which contains the image data of the encoded video frames, is provided to the inverse quantizer 1020, and the encoded motion information, which contains the motion vectors, is provided to the motion information decoder 1050.
The inverse quantizer 1020 inversely quantizes the texture information to obtain transform coefficients, which are then sent to the inverse spatial transformer 1030. The inverse spatial transformer 1030 performs an inverse spatial transform on the transform coefficients to obtain a single low-pass subband and a plurality of high-pass subbands for each GOP. The inverse motion compensator 1040 receives the low-pass subband and the plurality of high-pass subbands of each GOP and reconstructs the video frames.
The motion information decoder 1050 decodes the encoded motion information to obtain the motion information. The motion information contains the base layer motion vectors of some blocks and both the base layer motion vectors and the residual motion vectors of the remaining blocks. The base layer motion vectors and residual motion vectors of the remaining blocks are sent to the motion vector synthesizer 1070.
The motion vector synthesizer 1070 synthesizes the base layer motion vectors and the residual motion vectors to obtain enhancement layer motion vectors, which are then provided to the inverse motion compensator 1040 to be used for the update and predicted-frame generation operations.
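Synthesis itself is a simple per-block addition; a sketch, assuming motion vectors are stored as (dy, dx) pairs keyed by block index:

```python
def synthesize_motion_vectors(base_mvs, residual_mvs):
    """Recover enhancement layer MVs for blocks whose residual MV survived
    predecoding; blocks without a residual MV keep their base layer MV."""
    synthesized = {}
    for block, base in base_mvs.items():
        if block in residual_mvs:
            dy, dx = residual_mvs[block]
            synthesized[block] = (base[0] + dy, base[1] + dx)
        else:
            synthesized[block] = base
    return synthesized
```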
Fig. 11 is a block diagram of a video decoder 1100 according to a third exemplary embodiment of the present invention.
The video decoder 1100 includes a bitstream interpreter 1110, an inverse quantizer 1120, an inverse spatial transformer 1130, an inverse motion compensator 1140, a motion information decoder 1150, a motion vector synthesizer 1170 and a motion vector refiner 1160. The components of the video decoder 1100 have substantially the same functions as their counterparts in the video decoder 1000 of Fig. 10. Unlike the video decoder 1000 of Fig. 10, the video decoder 1100 further includes the motion vector refiner 1160.
The motion vector refiner 1160 refines the base layer motion vectors using the synthesized motion vectors of adjacent blocks. Alternatively, the motion vector refiner 1160 may refine the base layer motion vectors using the predicted frames obtained during inverse motion compensation as a reference. The synthesized motion vectors and the refined motion vectors are provided to the inverse motion compensator 1140 to be used for the update and predicted-frame generation operations.
Fig. 12 illustrates a video service environment according to an exemplary embodiment of the present invention.
Referring to Fig. 12, a video encoder 1210 encodes video frames into a bitstream using scalable video coding. The structure of a bitstream according to an exemplary embodiment of the present invention will be described later with reference to Fig. 13.
A predecoder 1220 truncates a portion of the bitstream (predecoding) according to the bandwidth of a network 1230. For example, when the bandwidth of the network 1230 is sufficient, users request high-quality video, and the predecoder 1220 truncates only a small number of bits from the bitstream, or none at all. On the other hand, when the available bandwidth is insufficient, the predecoder 1220 truncates a large number of bits from the bitstream.
A video decoder 1240 receives the predecoded bitstream via the network 1230 and reconstructs the video frames.
Fig. 13 illustrates the structure of a bitstream according to an exemplary embodiment of the present invention.
Referring to Fig. 13, the bitstream includes a header 1310, a motion vector field 1320 and a texture information field 1330.
The header 1310 may include a sequence header, a GOP header, a frame header and a slice header that specify the information necessary for the sequence, GOP, frame and slice, respectively.
The motion vector field 1320 includes an order information field 1321, a base layer motion vector field 1322 and an enhancement layer motion vector field 1323.
The order information field 1321 contains information about the order of the blocks in which the degree of video quality improvement decreases. For example, when enhancement layer motion vectors are used for blocks 1 to 6 and the degree of visual quality improvement decreases in the order of blocks 1, 4, 2, 3, 5 and 6, the order information specifies the order as 1, 4, 2, 3, 5, 6. Accordingly, during predecoding, enhancement layer motion vectors are truncated in the order of the blocks in which the degree of visual quality improvement increases (6, 5, 3, 2, 4, 1).
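A sketch of this truncation, assuming a fixed (and purely illustrative) bit cost per encoded enhancement layer motion vector:

```python
def truncate_enhancement_mvs(order_info, enh_mvs, motion_budget_bits, bits_per_mv=24):
    """Drop enhancement layer MVs from the tail of the order information (the
    blocks with the smallest quality improvement) until the rest fit the budget."""
    keep = len(order_info)
    while keep > 0 and keep * bits_per_mv > motion_budget_bits:
        keep -= 1
    kept_blocks = set(order_info[:keep])        # most important blocks survive
    return {b: mv for b, mv in enh_mvs.items() if b in kept_blocks}

# With the order 1,4,2,3,5,6 above and a budget for four vectors, blocks 6 and 5
# lose their enhancement layer MVs first.
```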
The base layer motion vector field 1322 contains information about the motion vectors obtained when a small number of bits are allocated to the motion vectors.
The enhancement layer motion vector field 1323 contains information about the motion vectors obtained when a large number of bits are allocated to the motion vectors.
The predecoder selectively truncates particular base layer motion vectors or enhancement layer motion vectors. That is, when the enhancement layer motion vector has been determined for a block, the predecoder truncates the base layer motion vector of that block from the bitstream. Conversely, when the base layer motion vector has been determined for a block, the predecoder truncates the enhancement layer motion vector of that block from the bitstream.
Alternatively, the motion vector field 1320 may include the base layer motion vector field 1322 and a residual motion vector field. In this case, when the base layer motion vector has been determined for a particular block, the predecoder truncates the residual motion vector of that block from the bitstream. On the other hand, when the enhancement layer motion vector has been determined for the block, the predecoder does not truncate the base layer motion vector. That is, the video decoder reconstructs the enhancement layer motion vector using the base layer motion vector and the residual motion vector of the block and uses it for inverse motion compensation.
The texture information field 1330 includes a Y component field 1331, which specifies the texture information of the Y component; a U component field 1332, which specifies the texture information of the U component; and a V component field 1333, which specifies the texture information of the V component.
The process of reducing the bit rate of a bitstream encoded using scalable video coding will now be described with reference to Fig. 14.
Fig. 14 is a graph showing the variation in video quality when enhancement layer motion vectors and base layer motion vectors are used.
As shown in Fig. 14, at high bit rates, the quality of the video reconstructed by the decoder when enhancement layer motion vectors are used is higher than the quality of the video reconstructed when base layer motion vectors are used. However, when the bit rate is extremely low, the quality of the video reconstructed using base layer motion vectors is higher than the quality of the video reconstructed using enhancement layer motion vectors.
Therefore, when a request for the bitstream is received at a bit rate higher than the reference point, the predecoder provides all of the enhancement layer motion vectors while truncating the bits of unnecessary texture. On the other hand, when a request for the bitstream is received at a bit rate lower than the reference point, the predecoder truncates texture bits and some or all of the enhancement layer motion vectors.
The reference point can be obtained experimentally from various video sequences.
Meanwhile, when the bit rate is extremely low, the predecoder may truncate all of the motion vectors (base layer motion vectors and enhancement layer motion vectors).
As described above, the video encoding and decoding methods and the video encoder and decoder according to the present invention can realize video coding that provides motion scalability. Unlike conventional video coding, which suffers a degradation in visual quality at very low bit rates because it is difficult to adjust the number of bits contained in the motion information, the video encoding and decoding methods according to the present invention provide improved visual quality at very low bit rates by minimizing the number of bits contained in the motion information.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.

Claims (39)

1. A video encoding method comprising:
estimating a base layer motion vector and an enhancement layer motion vector for each block in a video frame;
removing temporal redundancy from the video frame using the enhancement layer motion vectors;
spatially transforming the video frame from which the temporal redundancy has been removed and quantizing the spatially transformed video frame to obtain texture information;
selecting one of the base layer motion vector and the enhancement layer motion vector for each block; and
generating a bitstream containing the base layer motion vector or the enhancement layer motion vector selected for each block and the texture information.
2. The method of claim 1, wherein the base layer motion vector is estimated using hierarchical variable size block matching.
3. The method of claim 1, wherein the enhancement layer motion vector is estimated by refining the base layer motion vector.
4. The method of claim 1, wherein each of the base layer motion vector and the enhancement layer motion vector has one of a plurality of block modes.
5. The method of claim 1, wherein selecting one of the base layer motion vector and the enhancement layer motion vector comprises:
calculating the difference between residual blocks obtained using the base layer motion vector and the enhancement layer motion vector;
determining an order of importance of the blocks according to the difference; and
selecting enhancement layer motion vectors for a predetermined percentage of the blocks and base layer motion vectors for the remaining percentage of the blocks, in order from the block with the greatest importance to the block with the least importance.
6. A video encoding method comprising:
estimating a base layer motion vector and an enhancement layer motion vector for each block in a video frame;
removing temporal redundancy within the video frame using the enhancement layer motion vectors;
performing a spatial transform on the video frame from which the temporal redundancy has been removed, and quantizing the spatially transformed video frame to obtain texture information; and
generating a bitstream including, for each block, the base layer motion vector, a residual motion vector that is the difference between the base layer motion vector and the enhancement layer motion vector, and the texture information.
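For illustration of claim 6, the sketch below packs, for each block, the base layer motion vector together with a residual motion vector computed as the enhancement layer vector minus the base layer vector. The dictionary layout and the sign convention are assumptions made for the example only.

```python
def build_motion_payload(blocks):
    """blocks: list of dicts with 'base_mv' and 'enh_mv' as (dx, dy) tuples.
    Returns a list of per-block motion records to place in the bitstream."""
    payload = []
    for b in blocks:
        bx, by = b['base_mv']
        ex, ey = b['enh_mv']
        payload.append({
            'base_mv': (bx, by),
            # Residual vectors are usually small, so they cost few bits and can
            # be truncated by the predecoder without losing the base layer.
            'residual_mv': (ex - bx, ey - by),
        })
    return payload
```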
7. The method of claim 6, wherein the base layer motion vector is estimated using hierarchical variable size block matching.
8. The method of claim 6, wherein the enhancement layer motion vector is estimated by refining the base layer motion vector.
9. The method of claim 6, wherein each of the base layer motion vector and the enhancement layer motion vector has one of a plurality of block modes.
10. The method of claim 6, further comprising: calculating, for each block, a difference between coding the block using the enhancement layer motion vector and coding the block using the base layer motion vector, and determining an order of importance of the blocks according to the magnitude of the difference,
wherein the order of importance is included in the bitstream.
11. The method of claim 10, wherein the difference is obtained by calculating a difference between a residual block obtained using the enhancement layer motion vector of the block and a residual block obtained using the base layer motion vector of the block.
12. A video encoder comprising:
a motion estimator which estimates a base layer motion vector and an enhancement layer motion vector for each block in a video frame;
a motion compensator which removes temporal redundancy within the video frame using the enhancement layer motion vectors;
a spatial transformer which performs a spatial transform on the video frame from which the temporal redundancy has been removed;
a quantizer which quantizes the spatially transformed video frame to obtain texture information;
a motion vector selector which selects one of the base layer motion vector and the enhancement layer motion vector for each block; and
a bitstream generator which generates a bitstream including the base layer motion vector or the enhancement layer motion vector selected for each block, together with the texture information.
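One way the components recited in claim 12 could be wired together is sketched below; the class and method names are illustrative assumptions rather than the encoder's actual interfaces.

```python
class VideoEncoder:
    def __init__(self, motion_estimator, motion_compensator,
                 spatial_transformer, quantizer,
                 motion_vector_selector, bitstream_generator):
        self.motion_estimator = motion_estimator
        self.motion_compensator = motion_compensator
        self.spatial_transformer = spatial_transformer
        self.quantizer = quantizer
        self.motion_vector_selector = motion_vector_selector
        self.bitstream_generator = bitstream_generator

    def encode(self, frames):
        # 1. Base and enhancement layer motion vectors per block.
        base_mvs, enh_mvs = self.motion_estimator.estimate(frames)
        # 2. Remove temporal redundancy with the enhancement layer vectors.
        residual_frames = self.motion_compensator.remove_temporal_redundancy(frames, enh_mvs)
        # 3. Spatial transform + quantization -> texture information.
        coefficients = self.spatial_transformer.transform(residual_frames)
        texture = self.quantizer.quantize(coefficients)
        # 4. Pick the base or enhancement vector for each block.
        selected_mvs = self.motion_vector_selector.select(base_mvs, enh_mvs)
        # 5. Pack everything into the scalable bitstream.
        return self.bitstream_generator.generate(selected_mvs, texture)
```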
13. The video encoder of claim 12, wherein the motion estimator estimates the base layer motion vector using hierarchical variable size block matching.
14. The video encoder of claim 12, wherein the motion estimator estimates the enhancement layer motion vector by refining the base layer motion vector.
15. The video encoder of claim 12, wherein the motion estimator estimates the base layer motion vector and the enhancement layer motion vector of each block in at least one of a plurality of block modes.
16. The video encoder of claim 12, wherein the motion estimator calculates a difference between residual blocks obtained using the base layer motion vector and the enhancement layer motion vector and determines an order of importance of the blocks according to the difference, and the motion vector selector selects the enhancement layer motion vector for a predetermined percentage of the blocks and the base layer motion vector for the remaining blocks, in order from the block with the highest importance to the block with the lowest importance.
17. A video encoder comprising:
a motion estimator which estimates a base layer motion vector and an enhancement layer motion vector for each block in a video frame;
a motion compensator which removes temporal redundancy within the video frame using the enhancement layer motion vectors;
a spatial transformer which performs a spatial transform on the video frame from which the temporal redundancy has been removed;
a quantizer which quantizes the spatially transformed video frame to obtain texture information; and
a bitstream generator which generates a bitstream including, for each block, the base layer motion vector, a residual motion vector that is the difference between the base layer motion vector and the enhancement layer motion vector, and the texture information.
18. The video encoder of claim 17, wherein the motion estimator estimates the base layer motion vector using hierarchical variable size block matching.
19. The video encoder of claim 17, wherein the motion estimator estimates the enhancement layer motion vector by refining the base layer motion vector.
20. The video encoder of claim 17, wherein the motion estimator estimates the base layer motion vector and the enhancement layer motion vector of each block in at least one of a plurality of block modes.
21. The video encoder of claim 17, wherein the motion estimator calculates a difference between residual blocks obtained using the base layer motion vector and the enhancement layer motion vector, determines an order of importance of the blocks according to the difference, and sends the order of importance to the bitstream generator so that the bitstream generator inserts the order of importance into the bitstream.
22. A predecoding method comprising:
receiving a bitstream including, for each block of a video frame, a base layer motion vector, a residual motion vector that is the difference between the base layer motion vector and an enhancement layer motion vector, and texture information obtained by encoding the video frame and associated with each block of the video frame; and
truncating at least a portion of the residual motion vectors.
23. The method of claim 22, wherein the bitstream further includes an order of importance of the blocks,
and wherein, in truncating at least a portion of the residual motion vectors, the order of importance of the blocks is used as a reference to truncate residual motion vectors starting from the block with the lowest importance.
24. The method of claim 22, wherein at least a portion of the residual motion vectors is truncated when the bit rate of a requested bitstream is lower than a predetermined reference point.
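A sketch of the predecoding recited in claims 22-24: below the reference point, residual motion vectors are truncated starting from the least important blocks, using the importance order carried in the bitstream (claim 23). The data layout, the keep_ratio parameter, and the function name are assumptions made for the example.

```python
def predecode_residual_mvs(blocks, importance_order, requested_kbps,
                           reference_kbps, keep_ratio=0.5):
    """blocks: dict block_index -> {'base_mv': ..., 'residual_mv': ...};
    importance_order: block indices sorted from most to least important."""
    if requested_kbps >= reference_kbps:
        return blocks  # above the reference point: keep all residual vectors
    n_keep = int(len(importance_order) * keep_ratio)
    for idx in importance_order[n_keep:]:  # the least important blocks
        blocks[idx]['residual_mv'] = None  # drop the residual, keep the base MV
    return blocks
```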
25. A video decoding method comprising:
interpreting an input bitstream and obtaining texture information and motion information including base layer motion vectors and enhancement layer motion vectors;
refining the base layer motion vectors;
performing inverse quantization and an inverse spatial transform on the texture information to obtain frames from which temporal redundancy has been removed; and
performing inverse motion compensation on the frames from which the temporal redundancy has been removed, using the refined base layer motion vectors and the enhancement layer motion vectors.
26. The method of claim 25, wherein the base layer motion vectors are refined using enhancement layer motion vectors of adjacent blocks as a reference.
27. The method of claim 25, wherein the base layer motion vectors are refined using a predicted frame generated during the inverse motion compensation.
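Claim 26 refines base layer motion vectors with the enhancement layer vectors of adjacent blocks as a reference. The claims do not fix a particular refinement rule, so the sketch below shows only one plausible interpretation (nudging the vector toward the neighborhood median); the function name and the halfway-average rule are assumptions made for the example.

```python
from statistics import median

def refine_base_mv(base_mv, neighbor_enh_mvs):
    """base_mv: (dx, dy); neighbor_enh_mvs: enhancement layer MVs of adjacent blocks."""
    if not neighbor_enh_mvs:
        return base_mv  # nothing to refine against
    med_x = median(v[0] for v in neighbor_enh_mvs)
    med_y = median(v[1] for v in neighbor_enh_mvs)
    # Illustrative rule: move halfway toward the neighborhood median.
    return (round((base_mv[0] + med_x) / 2), round((base_mv[1] + med_y) / 2))
```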
28. A video decoding method comprising:
interpreting an input bitstream and obtaining texture information and motion information including base layer motion vectors and residual motion vectors;
for each of a plurality of blocks having both a base layer motion vector and a residual motion vector, synthesizing the base layer motion vector and the residual motion vector to obtain a synthesized motion vector;
performing inverse quantization and an inverse spatial transform on the texture information to obtain frames from which temporal redundancy has been removed; and
performing inverse motion compensation on the frames from which the temporal redundancy has been removed, using the synthesized motion vectors and the unsynthesized base layer motion vectors.
29. The method of claim 28, further comprising refining the unsynthesized base layer motion vectors,
wherein the inverse motion compensation is performed on the frames from which the temporal redundancy has been removed using the synthesized motion vectors and the refined base layer motion vectors.
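A sketch of the synthesis in claims 28-29: where a residual motion vector survived predecoding it is added back to the base layer vector, while blocks whose residual was truncated fall back to the (optionally refined) base layer vector. Names and data layout are assumptions made for the example.

```python
def synthesize_motion_vectors(blocks, refine=None):
    """blocks: dict block_index -> {'base_mv': (x, y), 'residual_mv': (x, y) or None};
    refine: optional callable applied to base MVs whose residual was truncated."""
    synthesized = {}
    for idx, b in blocks.items():
        bx, by = b['base_mv']
        if b.get('residual_mv') is not None:
            rx, ry = b['residual_mv']
            synthesized[idx] = (bx + rx, by + ry)  # reconstructed enhancement vector
        else:
            synthesized[idx] = refine((bx, by)) if refine else (bx, by)
    return synthesized
```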
30. A video decoder comprising:
a bitstream interpreter which interprets an input bitstream and obtains texture information and motion information including base layer motion vectors and enhancement layer motion vectors;
a motion vector refiner which refines the base layer motion vectors;
an inverse quantizer which performs inverse quantization on the texture information;
an inverse spatial transformer which performs an inverse spatial transform on the inversely quantized texture information to obtain frames from which temporal redundancy has been removed; and
an inverse motion compensator which performs inverse motion compensation on the frames from which the temporal redundancy has been removed, using the refined base layer motion vectors and the enhancement layer motion vectors, and reconstructs video frames.
31. The video decoder of claim 30, wherein the motion vector refiner refines the base layer motion vectors using enhancement layer motion vectors of adjacent blocks.
32. The video decoder of claim 30, wherein the motion vector refiner refines the base layer motion vectors using a predicted frame generated by the inverse motion compensator.
33. A video decoder comprising:
a bitstream interpreter which interprets an input bitstream and obtains texture information and motion information including base layer motion vectors and residual motion vectors;
a motion vector synthesizer which, for each of a plurality of blocks having both a base layer motion vector and a residual motion vector, synthesizes the base layer motion vector and the residual motion vector to obtain a synthesized motion vector;
an inverse quantizer which performs inverse quantization on the texture information;
an inverse spatial transformer which performs an inverse spatial transform on the inversely quantized texture information to obtain frames from which temporal redundancy has been removed; and
an inverse motion compensator which performs inverse motion compensation on the frames from which the temporal redundancy has been removed, using the synthesized motion vectors and the unsynthesized base layer motion vectors.
34. The video decoder of claim 33, further comprising a motion vector refiner which refines the unsynthesized base layer motion vectors.
35. A recording medium having recorded thereon a computer-readable program for executing a video encoding method, the method comprising:
estimating a base layer motion vector and an enhancement layer motion vector for each block in a video frame;
removing temporal redundancy within the video frame using the enhancement layer motion vectors;
performing a spatial transform on the video frame from which the temporal redundancy has been removed, and quantizing the spatially transformed video frame to obtain texture information;
selecting one of the base layer motion vector and the enhancement layer motion vector for each block; and
generating a bitstream including the base layer motion vector or the enhancement layer motion vector selected for each block, together with the texture information.
36. A recording medium having recorded thereon a computer-readable program for executing a video encoding method, the method comprising:
estimating a base layer motion vector and an enhancement layer motion vector for each block in a video frame;
removing temporal redundancy within the video frame using the enhancement layer motion vectors;
performing a spatial transform on the video frame from which the temporal redundancy has been removed, and quantizing the spatially transformed video frame to obtain texture information; and
generating a bitstream including, for each block, the base layer motion vector, a residual motion vector that is the difference between the base layer motion vector and the enhancement layer motion vector, and the texture information.
37. A recording medium having recorded thereon a computer-readable program for executing a predecoding method, the method comprising:
receiving a bitstream including, for each block of a video frame, a base layer motion vector, a residual motion vector that is the difference between the base layer motion vector and an enhancement layer motion vector, and texture information obtained by encoding the video frame and associated with each block of the video frame; and
truncating at least a portion of the residual motion vectors.
38. A recording medium having recorded thereon a computer-readable program for executing a video decoding method, the method comprising:
interpreting an input bitstream and obtaining texture information and motion information including base layer motion vectors and enhancement layer motion vectors;
refining the base layer motion vectors;
performing inverse quantization and an inverse spatial transform on the texture information to obtain frames from which temporal redundancy has been removed; and
performing inverse motion compensation on the frames from which the temporal redundancy has been removed, using the refined base layer motion vectors and the enhancement layer motion vectors.
39. A recording medium having recorded thereon a computer-readable program for executing a video decoding method, the method comprising:
interpreting an input bitstream and obtaining texture information and motion information including base layer motion vectors and residual motion vectors;
for each of a plurality of blocks having both a base layer motion vector and a residual motion vector, synthesizing the base layer motion vector and the residual motion vector to obtain a synthesized motion vector;
performing inverse quantization and an inverse spatial transform on the texture information to obtain frames from which temporal redundancy has been removed; and
performing inverse motion compensation on the frames from which the temporal redundancy has been removed, using the synthesized motion vectors and the unsynthesized base layer motion vectors.
CNB2005100841402A 2004-07-15 2005-07-14 Video encoding and decoding methods and video encoder and decoder Expired - Fee Related CN100466735C (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US58790504P 2004-07-15 2004-07-15
US60/587,905 2004-07-15
KR1020040063198 2004-08-11
KR1020040118021 2004-12-31

Publications (2)

Publication Number Publication Date
CN1722839A true CN1722839A (en) 2006-01-18
CN100466735C CN100466735C (en) 2009-03-04

Family

ID=35912733

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2005100841402A Expired - Fee Related CN100466735C (en) 2004-07-15 2005-07-14 Video encoding and decoding methods and video encoder and decoder

Country Status (1)

Country Link
CN (1) CN100466735C (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103765891B (en) * 2011-08-29 2017-07-28 苗太平洋控股有限公司 Device for decoding the movable information under merging patterns

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6148026A (en) * 1997-01-08 2000-11-14 At&T Corp. Mesh node coding to enable object based functionalities within a motion compensated transform video coder
US6057884A (en) * 1997-06-05 2000-05-02 General Instrument Corporation Temporal and spatial scaleable coding for video object planes
JP2000228773A (en) * 1998-12-04 2000-08-15 Mitsubishi Electric Corp Video signal coder
EP1161839A1 (en) * 1999-12-28 2001-12-12 Koninklijke Philips Electronics N.V. Snr scalable video encoding method and corresponding decoding method
US6510177B1 (en) * 2000-03-24 2003-01-21 Microsoft Corporation System and method for layered video coding enhancement
EP1442601A1 (en) * 2001-10-26 2004-08-04 Koninklijke Philips Electronics N.V. Method and appartus for spatial scalable compression

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102090061A (en) * 2008-07-09 2011-06-08 苹果公司 Video streaming using multiple channels
CN102090061B (en) * 2008-07-09 2014-05-07 苹果公司 Video streaming using multiple channels
CN108401157A (en) * 2012-10-01 2018-08-14 Ge视频压缩有限责任公司 Scalable video decoder, encoder and telescopic video decoding, coding method
CN108401157B (en) * 2012-10-01 2022-06-24 Ge视频压缩有限责任公司 Scalable video decoder, scalable video encoder, and scalable video decoding and encoding methods
US11477467B2 (en) 2012-10-01 2022-10-18 Ge Video Compression, Llc Scalable video coding using derivation of subblock subdivision for prediction from base layer
US11575921B2 (en) 2012-10-01 2023-02-07 Ge Video Compression, Llc Scalable video coding using inter-layer prediction of spatial intra prediction parameters
US11589062B2 (en) 2012-10-01 2023-02-21 Ge Video Compression, Llc Scalable video coding using subblock-based coding of transform coefficient blocks in the enhancement layer

Also Published As

Publication number Publication date
CN100466735C (en) 2009-03-04

Similar Documents

Publication Publication Date Title
CN1722836A (en) Video coding and coding/decoding method and video encoder and decoder
CN1722831A (en) To comprising basic layer the bit stream pre decoding and the method and apparatus of decoding
JP5203503B2 (en) Bit depth scalability
CN1231863C (en) Method and apparatus for compressing and decompressing image
CN1906945A (en) Method and apparatus for scalable video encoding and decoding
CN1914921A (en) Apparatus and method for scalable video coding providing scalability in encoder part
CN101036388A (en) Method and apparatus for predecoding hybrid bitstream
CN1722838A (en) Use the scalable video coding method and apparatus of basal layer
CN101049026A (en) Scalable video coding with grid motion estimation and compensation
CN1930890A (en) Method and apparatus for scalable video coding and decoding
CN1910924A (en) Video coding apparatus and method for inserting key frame adaptively
CN1961582A (en) Method and apparatus for effectively compressing motion vectors in multi-layer structure
CN1625265A (en) Method and apparatus for scalable video encoding and decoding
US20060013309A1 (en) Video encoding and decoding methods and video encoder and decoder
CN101069429A (en) Method and apparatus for multi-layered video encoding and decoding
CN101069430A (en) Scalable multi-view image encoding and decoding apparatuses and methods
CN1926874A (en) Method and apparatus for video coding, predecoding, and video decoding for video streaming service, and image filtering method
CN1943244A (en) Inter-frame prediction method in video coding, video encoder, video decoding method, and video decoder
CN1812580A (en) Deblocking control method considering intra bl mode and multilayer video encoder/decoder using the same
CN1574970A (en) Method and apparatus for encoding/decoding image using image residue prediction
CN1947426A (en) Method and apparatus for implementing motion scalability
CN1247670A (en) Appts. and method for optimizing rate control in coding system
CN1951122A (en) Scalable video coding method supporting variable gop size and scalable video encoder
KR100834749B1 (en) Device and method for playing scalable video streams
CN1921627A (en) Video data compaction coding method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20090304