EP0888011B1 - Scalable predictive contour coding method and apparatus - Google Patents


Info

Publication number
EP0888011B1
Authority
EP
European Patent Office
Prior art keywords
primary
vertices
contour
vertex
current
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
EP19970306223
Other languages
German (de)
English (en)
French (fr)
Other versions
EP0888011A2 (en)
EP0888011A3 (en)
Inventor
Seok-Won Han
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WiniaDaewoo Co Ltd
Original Assignee
Daewoo Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daewoo Electronics Co Ltd filed Critical Daewoo Electronics Co Ltd
Publication of EP0888011A2 publication Critical patent/EP0888011A2/en
Publication of EP0888011A3 publication Critical patent/EP0888011A3/en
Application granted granted Critical
Publication of EP0888011B1 publication Critical patent/EP0888011B1/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T9/00Image coding
    • G06T9/20Contour coding, e.g. using detection of edges
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/20Circuitry for controlling amplitude response
    • H04N5/205Circuitry for controlling amplitude response for correcting amplitude versus frequency characteristic
    • H04N5/208Circuitry for controlling amplitude response for correcting amplitude versus frequency characteristic for compensating for attenuation of high frequency components, e.g. crispening, aperture distortion correction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • H04N19/537Motion estimation other than block-based
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • H04N19/537Motion estimation other than block-based
    • H04N19/543Motion estimation other than block-based using regions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/20Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/30Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability

Definitions

  • the present invention relates to a method and apparatus for inter-contour coding a contour of an object in a video signal with scalability; and, more particularly, to a method and apparatus capable of sequentially improving the quality of a video image by transmitting a base layer first and enhancement layers thereafter and also capable of reducing the amount of transmission data through the use of a contour motion estimation technique.
  • WO 97/15902 A discloses a method for scalably intra-contour coding a video signal, comprising the steps of: generating a predicted contour, sampling the difference between the contour and the predicted contour, and coding the samples into a plurality of layers.
  • n is an integer greater than 2.
  • a decoder does not decode layers from an (n+1)st layer if there is no need to improve the quality of the video image any further. In other words, a decoder decodes just as many layers as needed, thereby reducing the bit loss resulting from transmitting overly accurate or finely divided video images.
  • a base layer L 0 contains 5 vertices P 0 (1) to P 0 (5)
  • a first enhancement layer L 1 contains 6 vertices P 1 (1) to P 1 (6)
  • a second enhancement layer L 2 contains 4 vertices P 2 (1) to P 2 (4).
  • Referring to Figs. 1A, 1B and 1C, there are illustrated exemplary diagrams of the scalable intra-contour coding technique.
  • P 1 (1) is located between P 0 (1) and P 0 (2)
  • P 1 (2) is located between P 0 (2) and P 0 (3)
  • P 1 (3) is located between P 0 (3) and P 0 (4)
  • both P 1 (4) and P 1 (5) are located between P 0 (4) and P 0 (5)
  • P 1 (6) is located between P 0 (5) and P 0 (1)
  • P 2 (1) is located between P 0 (1) and P 1 (1)
  • P 2 (2) is located between P 1 (1) and P 0 (2)
  • P 2 (3) is located between P 1 (4) and P 1 (5)
  • P 2 (4) is located between P 0 (5) and P 1 (6).
  • bit streams B i , each of which corresponds to one of the layers L i , comprise vertex sequence information and vertex position information, wherein i is an integer ranging from 0 to 2.
  • the bit stream B 0 of the base layer L 0 contains the vertex position information indicating the position of P 0 (1) to P 0 (5).
  • the bit stream B 1 of the first enhancement layer L 1 contains the vertex sequence information 1, 1, 1, 2, 1, and the vertex position information P 1 (1) to P 1 (6), wherein the vertex sequence information 1, 1, 1, 2, 1 indicates the number of vertices of the first enhancement layer L 1 to be inserted between two adjacent vertices of the base layer L 0 .
  • 1 vertex is inserted between P 0 (1) and P 0 (2), 1 vertex between P 0 (2) and P 0 (3), 1 vertex between P 0 (3) and P 0 (4), 2 vertices are inserted between P 0 (4) and P 0 (5), and 1 vertex is inserted between P 0 (5) and P 0 (1); and the sequence of insertion is in the order of P 1 (1), P 1 (2), P 1 (3), P 1 (4), P 1 (5) and P 1 (6).
  • the bit stream B 2 of the second enhancement layer L 2 contains the vertex sequence information 1, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, and the vertex position information P 2 (1) to P 2 (4), wherein the vertex sequence information 1, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0 indicates the number of vertices of the second enhancement layer L 2 to be inserted between two adjacent vertices of the base layer L 0 and the first enhancement layer L 1 . That is, 1 vertex is inserted between P 0 (1) and P 1 (1), 1 vertex is inserted between P 1 (1) and P 0 (2), no vertex is inserted between P 0 (2) and P 1 (2), and so on; and the sequence of insertion is in the order of P 2 (1), P 2 (2), P 2 (3) and P 2 (4).
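As a sketch of how a decoder might apply the vertex sequence information described above, the following Python fragment interleaves the enhancement-layer vertices of Figs. 1A-1C between the lower-layer vertices. The function and vertex-label names are illustrative, not from the patent:

```python
def merge_layer(lower_vertices, seq_info, new_vertices):
    """Interleave enhancement-layer vertices into a lower layer.

    seq_info[k] is the number of new vertices to insert between
    lower_vertices[k] and its (cyclically) next neighbour."""
    assert len(seq_info) == len(lower_vertices)
    assert sum(seq_info) == len(new_vertices)
    merged, pending = [], iter(new_vertices)
    for vertex, count in zip(lower_vertices, seq_info):
        merged.append(vertex)
        for _ in range(count):
            merged.append(next(pending))
    return merged

# Figs. 1A-1B: base layer L0 plus first enhancement layer L1,
# with vertex sequence information 1, 1, 1, 2, 1.
L0 = ["P0(1)", "P0(2)", "P0(3)", "P0(4)", "P0(5)"]
L1 = ["P1(1)", "P1(2)", "P1(3)", "P1(4)", "P1(5)", "P1(6)"]
contour = merge_layer(L0, [1, 1, 1, 2, 1], L1)
```

Applying the same step again with the second-layer sequence information would insert P 2 (1) to P 2 (4) into the 11-vertex result.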
  • the shape of a restored video image becomes closer to an original shape of a contour as more layers are decoded.
  • Although the scalable intra-contour coding technique is capable of reducing the bit loss resulting from transmitting an overly accurate or finely divided video image, it does not take into account the temporal redundancy between two consecutive frames. Thus, it still remains desirable to further reduce the volume of transmission data through the use of a contour motion estimation technique in order to more effectively implement a low-bit rate codec system having, e.g., a 64 kb/s transmission channel bandwidth.
  • a method for scalably inter-contour coding a video signal having a previous and a current frame, wherein each of the previous and current frames contains a contour, the method comprising the steps of: generating a predicted contour by motion estimating and compensating a previous base layer; widening the predicted contour by a predetermined threshold D max (j); generating (j)th primary and secondary vertices; reconstructing a current base layer; widening the predicted contour by a predetermined threshold D max (j+1); generating (j+1)st primary and secondary vertices; and updating the (j)th primary and secondary vertices.
  • the (j)th primary and secondary vertices are encoded; and the encoded primary and secondary vertex information is formatted in a predetermined way.
  • the value of j is increased by 1, and the steps from widening the predicted contour by D max (j+1) through formatting the encoded primary and secondary vertex information are repeated until j becomes N, N being a positive integer.
  • Referring to FIG. 2, there is shown a block diagram of an apparatus 1 for encoding a contour in accordance with the present invention.
  • the motion estimation unit 10 determines a motion vector MV by motion estimation.
  • a previous base layer is displaced on a pixel-by-pixel basis to thereby overlap with the current contour, and the pixels which belong to both of the current contour and the previous base layer are counted at each displacement.
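The pixel-overlap search described above can be sketched as a brute-force loop over candidate displacements. The code assumes both contours are given as sets of (x, y) pixels; the search range and all names are illustrative assumptions:

```python
def estimate_motion(prev_base, current, search_range=8):
    """Return the displacement (dx, dy) maximizing the number of pixels
    shared by the displaced previous base layer and the current contour."""
    best_mv, best_count = (0, 0), -1
    for dx in range(-search_range, search_range + 1):
        for dy in range(-search_range, search_range + 1):
            # Count pixels of the displaced previous base layer that
            # fall on the current contour.
            count = sum((x + dx, y + dy) in current for (x, y) in prev_base)
            if count > best_count:
                best_count, best_mv = count, (dx, dy)
    return best_mv

# Toy example: the current contour equals the previous base layer
# shifted by (3, -2).
prev_base = {(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (0, 1)}
current = {(x + 3, y - 2) for (x, y) in prev_base}
mv = estimate_motion(prev_base, current, search_range=4)
```

The exhaustive search is quadratic in the range; a real codec would typically restrict or refine it, but the maximum-overlap criterion is the one stated above.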
  • the motion vector MV on a line L20 is provided to a motion compensation unit 20 and a formatter 90.
  • the motion compensation unit 20 generates a predicted contour. That is, all the pixels on the previous base layer are moved in the direction of the current contour by the motion vector MV to thereby generate the predicted contour.
  • the predicted contour is provided to the widening unit 30.
  • the widening unit 30 overlaps the predicted contour with the current contour, detects coincidence portions, wherein a coincidence portion is a portion of the current contour where the predicted contour overlaps with the current contour, and determines coincidence vertices. If the coincidence portion contains more than one pixel, end points of the coincidence portion are determined as the coincidence vertices. If the coincidence portion contains only one pixel, the very pixel is determined as the coincidence vertex.
  • exemplary coincidence vertices A to Z are generated.
  • A, B, F, G, L, M, N, O, R, S, W, and X are the exemplary coincidence vertices, each being an end point of a corresponding exemplary coincidence portion, wherein each exemplary coincidence portion is a portion of an exemplary current contour 34 that overlaps with the exemplary predicted contour 32.
  • C, D, E, H, I, J, K, P, Q, T, U, V, Y, and Z are the exemplary coincidence vertices at which the exemplary predicted contour 32 merely crosses the exemplary current contour 34.
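A minimal sketch of the coincidence-vertex rule above, assuming the current contour is an ordered, cyclic list of pixels and the predicted contour a set of pixels (names are illustrative):

```python
def coincidence_vertices(current, predicted):
    """current: ordered cyclic list of contour pixels; predicted: set of
    pixels.  Each run of current-contour pixels that also lie on the
    predicted contour yields its end points as coincidence vertices
    (a one-pixel run yields a single vertex)."""
    n = len(current)
    on = [p in predicted for p in current]
    vertices = []
    for i in range(n):
        # An end point of a run is an "on" pixel with an "off" cyclic
        # neighbour on at least one side.
        if on[i] and (not on[i - 1] or not on[(i + 1) % n]):
            vertices.append(current[i])
    return vertices

# Toy contour (8 pixels around a square); four pixels of it lie on
# the predicted contour: a 3-pixel run and a 1-pixel run.
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
verts = coincidence_vertices(square, {(1, 0), (2, 0), (2, 1), (1, 2)})
```

The 3-pixel run contributes its two end points and the isolated pixel contributes itself, matching the two cases in the text.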
  • the widening unit 30 widens the predicted contour by a predetermined threshold D max (j) to thereby generate a first widened contour band, wherein j is 0.
  • D max (j) decreases as j increases, wherein D max (j) is nonnegative.
  • the exemplary predicted contour 32 is widened by D max (0), to thereby generate a first exemplary widened contour band 32'.
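The widening step can be sketched as a morphological dilation of the predicted contour's pixel set. The Chebyshev (square) neighbourhood below is an assumption, since the text does not fix a distance metric:

```python
def widen(contour_pixels, d_max):
    """Dilate a contour into a band: every pixel within Chebyshev
    distance d_max of some contour pixel belongs to the widened band."""
    band = set()
    for (x, y) in contour_pixels:
        for dx in range(-d_max, d_max + 1):
            for dy in range(-d_max, d_max + 1):
                band.add((x + dx, y + dy))
    return band

band = widen({(0, 0)}, 1)   # a single pixel widens to a 3x3 block
```

Because D max (j) decreases with j, each pass produces a narrower band and thus a stricter match criterion.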
  • the widening unit 30 provides information on the current and predicted contours, the first widened contour band, and the coincidence vertices to a matched segment determination unit 40.
  • the matched segment determination unit 40 detects first matching portions.
  • Each of the first matching portions represents a portion of the current contour overlapping with the first widened contour band between two coincidence vertices, and two adjacent first matching portions which have one end point in common are considered as one first matching portion.
  • the matched segment determination unit 40 compares each length of the first matching portions with a predetermined threshold TH, to thereby select first matching portions whose lengths are longer than the predetermined threshold TH as first matched segments. End points of the first matched segments are determined as first primary vertices.
  • portions from A to I, K to O, R to U, and V to Z are exemplary first matched segments of the exemplary current contour 34
  • A, I, K, O, R, U, V, and Z are exemplary first primary vertices of the exemplary current contour 34.
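A sketch of the matched-segment test above, assuming the current contour is an ordered cyclic pixel list and the widened band a pixel set (names and the merging of wrap-around runs are illustrative):

```python
def matched_segments(current, band, th):
    """Runs of current-contour pixels lying inside the widened band;
    runs longer than th pixels become matched segments.  Their end
    points would then be taken as primary vertices."""
    n = len(current)
    inside = [p in band for p in current]
    if all(inside):
        return [list(current)]        # the whole contour matches
    start = inside.index(False)       # rotate so no run wraps around
    segments, run = [], []
    for k in range(n):
        i = (start + k) % n
        if inside[i]:
            run.append(current[i])
        elif run:
            if len(run) > th:
                segments.append(run)
            run = []
    if len(run) > th:
        segments.append(run)
    return segments

# Toy example: a 10-pixel contour with a 5-pixel and a 1-pixel overlap;
# only the 5-pixel run survives the length threshold TH = 2.
contour = [(i, 0) for i in range(10)]
band = {(2, 0), (3, 0), (4, 0), (5, 0), (6, 0), (8, 0)}
segs = matched_segments(contour, band, th=2)
```

The first and last pixel of each surviving run play the role of the first primary vertices.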
  • the secondary vertex determination unit 50 determines first unmatched segments, wherein the first unmatched segment is a segment on the current contour that does not belong to any of the first matched segments. Then, first secondary vertices are determined on each of the first unmatched segments on the current contour through the use of the conventional polygonal approximation technique based on the predetermined threshold D max (j), wherein the predetermined threshold D max (j) is the same threshold used at the widening unit 30.
  • a contour pixel on any first unmatched segment which has a largest distance to a line segment corresponding thereto is determined as a first start vertex when the largest distance is greater than D max (j). Those first start vertices determined by the polygonal approximation are defined as first secondary vertices.
  • portions from I to K, O to R, U to V, and Z to A are exemplary first unmatched segments on the exemplary current contour 34, and 1V 1 to 1V 7 are exemplary first secondary vertices of the exemplary current contour 34.
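The polygonal approximation above is the classic split-at-farthest-point recursion (Douglas-Peucker style): a sketch under the assumption of Euclidean point-to-chord distance, with illustrative names:

```python
def polygonal_approx(points, d_max):
    """Secondary vertices of an open segment: recursively pick the
    point farthest from the chord joining the segment's end points
    whenever that distance exceeds d_max."""
    def dist(p, a, b):
        (px, py), (ax, ay), (bx, by) = p, a, b
        if (ax, ay) == (bx, by):
            return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
        num = abs((bx - ax) * (ay - py) - (ax - px) * (by - ay))
        return num / ((bx - ax) ** 2 + (by - ay) ** 2) ** 0.5

    def recurse(lo, hi, out):
        best_d, best_i = 0.0, None
        for i in range(lo + 1, hi):
            d = dist(points[i], points[lo], points[hi])
            if d > best_d:
                best_d, best_i = d, i
        if best_i is not None and best_d > d_max:
            out.append(points[best_i])   # new (secondary) vertex
            recurse(lo, best_i, out)
            recurse(best_i, hi, out)

    vertices = []
    recurse(0, len(points) - 1, vertices)
    return vertices
```

For the unmatched segment [(0, 0), (1, 2), (2, 0)], the middle point lies 2 pixels off the chord, so it becomes a secondary vertex when D max is 1 but not when D max is 3.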
  • the secondary vertex determination unit 50 provides information on the current and predicted contours, the first primary and the first secondary vertices, and the first matched and the first unmatched segments to an overlapped contour memory 60.
  • the overlapped contour memory 60 gives indices to the first primary vertices to thereby store information on the first secondary and indexed first primary vertices as a first overlapped contour OC(1).
  • 1S 1 indicates a first start point of the first matched segment
  • 1S 2 indicates a second start point of the first matched segment, and so on
  • 1E 1 indicates a first end point of the first matched segment
  • 1E 2 indicates a second end point of the first matched segment, and so on.
  • an exemplary current base layer 36 comprises thick lines which are the exemplary first matched segments, and thin lines which connect two adjacent exemplary primary or secondary vertices.
  • the reconstructed current base layer is stored in a base layer storage unit 110.
  • the overlapped contour memory 60 increases j by 1 and provides information on the current and the predicted contours, and the first overlapped contour OC(1) to the widening unit 30 via a line L50.
  • the widening unit 30 widens the predicted contour from the overlapped contour memory 60 by the predetermined threshold D max (j) to thereby generate a second widened contour band, wherein j is 1 and D max (1) is smaller than D max (0).
  • the widening unit 30 provides information on the current and the predicted contours, the second widened contour band, and the first overlapped contour OC(1) to the matched segment determination unit 40.
  • the operation of the matched segment determination unit 40 for the case when j is 1 is similar to that for the case when j is 0. That is, portions where the second widened contour band coincides with the current contour are determined as second matching portions. Thereafter the second matching portions whose lengths are longer than the predetermined threshold TH are determined as second matched segments and end points of the second matched segments are determined as second primary vertices.
  • the secondary vertex determination unit 50 determines second unmatched segments, wherein the second unmatched segment is a portion on the current contour between two adjacent points among the first primary vertices, the first secondary vertices and the second primary vertices that does not belong to any of the second matched segments.
  • second secondary vertices are determined on each of the second unmatched segments on the current contour through the use of the conventional polygonal approximation technique based on the predetermined threshold D max (j), wherein the predetermined threshold D max (j) is the same threshold used at the widening unit 30.
  • According to the conventional polygonal approximation technique, a contour pixel on any second unmatched segment which has a largest distance to a line segment corresponding thereto is determined as a second start vertex when the largest distance is greater than the predetermined threshold D max (j).
  • Those second start vertices determined by the polygonal approximation are referred to as second secondary vertices.
  • the secondary vertex determination unit 50 provides information on the current and the predicted contours, the second primary and the second secondary vertices, and the second matched and the second unmatched segments to the overlapped contour memory 60.
  • the overlapped contour memory 60 gives indices to the second primary vertices to thereby store information on the second secondary and indexed second primary vertices as a second overlapped contour OC(2).
  • 2S 1 indicates a first start point of the second matched segment
  • 2S 2 indicates a second start point of the second matched segment, and so on
  • 2E 1 indicates a first end point of the second matched segment
  • 2E 2 indicates a second end point of the second matched segment, and so on.
  • After storing the second overlapped contour OC(2), the overlapped contour memory 60 updates the first overlapped contour OC(1).
  • the overlapped contour memory 60 gives indices to such points, which indicates that the corresponding points are first start-end points.
  • the first start-end points are the points which are the first primary vertices but not the second primary vertices.
  • 1SE 1 indicates a first start-end point of the base layer
  • 1SE 2 indicates a second start-end point of the base layer
  • An updated first overlapped contour OC'(1) is stored in the overlapped contour memory 60.
  • FIG. 6A depicts the exemplary coincidence vertices A to Z, and the exemplary coincidence portions AB, FG, LM, NO, RS and WX in accordance with the preferred embodiment of the present invention.
  • Fig. 6B illustrates the exemplary first primary vertices A, I, K, O, R, U, V and Z, the exemplary first secondary vertices 1V 1 to 1V 7 , and the exemplary first matched segments AI, KO, RU and VZ resulting from the construction of the exemplary current base layer.
  • FIG. 6C is an exemplary first overlapped contour OC(1) stored in the overlapped contour memory 60 when j is 0, wherein 1S 1 to 1S 4 are exemplary start points and 1E 1 to 1E 4 are exemplary end points of the exemplary first matched segments.
  • Fig. 6D represents exemplary second primary vertices A, C, E, H, L, O, R, S, W and Y, and exemplary second matched segments AC, EH, LO, RS and WY as the result of the contour widening by D max (1).
  • FIG. 6E shows exemplary second unmatched segments CE, HI, I1V 1 , 1V 1 1V 2 , 1V 2 K, KL, O1V 3 , 1V 3 1V 4 , 1V 4 1V 5 , 1V 5 R, SU, U1V 6 , 1V 6 V, VW, YZ, Z1V 7 and 1V 7 A.
  • Fig. 6F illustrates exemplary second secondary vertices 2V 1 to 2V 15 .
  • Fig. 6G is an exemplary second overlapped contour OC(2) stored in the overlapped contour memory 60 when j is 1, wherein 2S 1 to 2S 5 are exemplary start points of the exemplary second matched segments, and 2E 1 to 2E 5 are exemplary end points of the exemplary second matched segments.
  • Fig. 6H is an exemplary first updated overlapped contour OC'(1) stored in the overlapped contour memory 60 when j is 1.
  • the overlapped contour memory 60 provides information on the predicted contour and the first primary vertices to a primary vertex coding unit 70 via a line L60, and information on the first secondary vertices and the updated first overlapped contour OC'(1) to a secondary vertex coding unit 80 via a line L70. Since, however, j is 1, the overlapped contour memory 60 does not provide the information on the first matched segments and the first secondary vertices to the base layer reconstruction unit 100 via the line L40.
  • the primary vertex coding unit 70 determines a reference point RP by using a predetermined method, for example, raster scanning; calculates the length of each segment between two adjacent first primary vertices; and encodes information on the first primary vertices by using the Reference Contour Based coding method in a predetermined direction, for instance, clockwise.
  • the number of the pixels constituting each segment is defined as the length of each segment.
  • a first primary vertex next to RP is encoded by representing the length from RP to itself by using n bits. Thereafter, remaining vertices are encoded as follows: If the length from RP to a first primary vertex just decoded is longer than m, the next first primary vertex is encoded by representing the length from the first primary vertex just decoded to itself by using n bits.
  • the next first primary vertex is encoded by representing the length from the first primary vertex just decoded to itself by using n' bits, wherein n' bits can afford to represent the length from the first primary vertex just decoded to RP.
  • the length of the segment AI, namely m1, which is longer than or equal to 2 n1 and shorter than 2 n1+1 , is the longest; the length from U to RP is longer than m1; and the length from V to RP is shorter than m1.
  • the exemplary first primary vertices A to U are encoded by representing the length between two adjacent exemplary first primary vertices by using n1 bits
  • V is encoded by representing the length from U to V by using n1' bits which can afford to represent the length from U to RP
  • Z is encoded by representing the length from V to Z by using n1" bits which can afford to represent the length from V to RP.
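The bit-saving rule above is somewhat ambiguous as reproduced here, so the sketch below follows one plausible reading: full n-bit length codes are used while the remaining distance back to RP still exceeds the n-bit range, after which just enough bits for the remaining distance suffice (since no later run can exceed it). The switching condition and all names are assumptions:

```python
from math import ceil, log2

def bits_for(max_value):
    """Bits needed to represent any length in 0..max_value."""
    return max(1, ceil(log2(max_value + 1)))

def encode_lengths(lengths, n_bits):
    """Assign a bit width to each run-length between consecutive
    primary vertices: a full n_bits code while the distance still to
    be travelled back to RP exceeds the n_bits range, then just enough
    bits for that remaining distance."""
    total = sum(lengths)
    traversed, codes = 0, []
    for length in lengths:
        remaining = total - traversed    # distance left to reach RP
        if remaining >= (1 << n_bits):
            codes.append((length, n_bits))
        else:
            codes.append((length, bits_for(remaining)))
        traversed += length
    return codes

# Five runs around a closed contour of total length 34, with n = 4:
# the last two vertices need only 3 and 2 bits respectively.
codes = encode_lengths([10, 9, 8, 4, 3], n_bits=4)
```

This mirrors the example above, where the final vertices V and Z are coded with the shrinking widths n1' and n1".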
  • the secondary vertex encoding unit 80 encodes the first secondary vertices.
  • Each of the first secondary vertices is encoded by representing a 2-dimensional displacement from the closest first primary or secondary vertex located counterclockwise thereof to itself.
  • 1V 1 is encoded by representing the displacement from I to 1V 1
  • 1V 2 is encoded by representing the displacement from 1V 1 to 1V 2
  • 1V 3 is encoded by representing the displacement from O to 1V 3 , and so on.
  • Encoded information on the first primary and secondary vertices is provided to the formatter 90.
  • the formatter 90 formats the encoded information on the first primary and secondary vertices in a predetermined way based on the updated first overlapped contour OC'(1). That is, the formatter 90 formats the encoded information in the order of: the motion vector MV, the number of the primary vertices, the number of the start-end points among the primary vertices, the numbers indicating the start-end points, the numbers of the secondary vertices in the unmatched segments, and the encoded primary and secondary vertex information, to thereby provide the result to a transmitter (not shown).
  • formatted current base layer information is MV, 8, 3, 2, 6, 8, 2, 3, 1, 1, A, I, K, O, R, U, V, Z, 1V 1 , 1V 2 , 1V 3 , 1V 4 , 1V 5 , 1V 6 , 1V 7 .
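Under the field order just described, the formatter's output can be sketched as a flat Python list; the actual bit-level packing is not shown and the names are illustrative:

```python
def format_base_layer(mv, primary, start_end_indices, secondary_counts,
                      secondary):
    """Assemble the base-layer fields in the order described above:
    motion vector, number of primary vertices, number of start-end
    points, their indices, per-unmatched-segment secondary-vertex
    counts, then the encoded vertex data."""
    stream = [mv, len(primary), len(start_end_indices)]
    stream += list(start_end_indices)
    stream += list(secondary_counts)
    stream += list(primary)
    stream += list(secondary)
    return stream

# The 8 primary and 7 secondary vertices of the running example.
stream = format_base_layer(
    "MV",
    ["A", "I", "K", "O", "R", "U", "V", "Z"],
    [2, 6, 8],            # indices marking the start-end points
    [2, 3, 1, 1],         # secondary vertices per unmatched segment
    ["1V%d" % i for i in range(1, 8)])
```

A decoder reading the fields back in the same order can recover which primary vertices are start-end points and how many secondary vertices follow in each unmatched segment.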
  • a decoder (not shown) can restore the current base layer by decoding the formatted information and by using the previously decoded previous base layer.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Compression, Expansion, Code Conversion, And Decoders (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
EP19970306223 1997-06-26 1997-08-15 Scalable predictive contour coding method and apparatus Expired - Lifetime EP0888011B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1019970027561A KR100244769B1 (ko) 1997-06-26 1997-06-26 Method and apparatus for inter-contour coding with scalability
KR9727561 1997-06-26

Publications (3)

Publication Number Publication Date
EP0888011A2 EP0888011A2 (en) 1998-12-30
EP0888011A3 EP0888011A3 (en) 2000-12-27
EP0888011B1 true EP0888011B1 (en) 2007-03-07

Family

ID=19511346

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19970306223 Expired - Lifetime EP0888011B1 (en) 1997-06-26 1997-08-15 Scalable predictive contour coding method and apparatus

Country Status (7)

Country Link
US (1) US6097756A (zh)
EP (1) EP0888011B1 (zh)
JP (1) JPH1127682A (zh)
KR (1) KR100244769B1 (zh)
CN (1) CN1149852C (zh)
DE (1) DE69737443T2 (zh)
IN (1) IN191385B (zh)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100373331B1 (ko) * 1997-07-02 2003-04-21 Pantech & Curitel Communications Inc. Apparatus and method for encoding/decoding scalable shape information using a scan interleaving method
KR19990008977A (ko) * 1997-07-05 1999-02-05 Bae Soon-hoon Contour encoding method
US6714679B1 (en) * 1998-02-05 2004-03-30 Cognex Corporation Boundary analyzer
JP2001266159A (ja) * 2000-03-17 2001-09-28 Toshiba Corp Object region information generating method and apparatus, and approximate polygon generating method and apparatus
WO2003021970A1 (en) * 2001-09-04 2003-03-13 Faroudja Cognition Systems, Inc. Low bandwidth video compression
JP4002502B2 (ja) * 2001-11-27 2007-11-07 Samsung Electronics Co Ltd Apparatus and method for encoding/decoding a coordinate interpolator
US7809204B2 (en) * 2002-10-18 2010-10-05 Samsung Electronics Co., Ltd. Method and apparatus for encoding and decoding key value data of coordinate interpolator
KR100885443B1 (ko) * 2005-04-06 2009-02-24 LG Electronics Inc. Method for decoding a video signal encoded using inter-layer prediction
JP4991760B2 (ja) * 2006-01-09 2012-08-01 LG Electronics Inc. Inter-layer prediction method for a video signal
KR100937590B1 (ko) * 2007-10-23 2010-01-20 Electronics and Telecommunications Research Institute System for providing multi-quality video content and method of upgrading the same
EP2278550B1 (en) * 2009-06-17 2013-08-14 Canon Kabushiki Kaisha Method of encoding and decoding a graphics path sequence into a layered scheme
US20140098880A1 (en) * 2012-10-05 2014-04-10 Qualcomm Incorporated Prediction mode information upsampling for scalable video coding
EP3122051A1 (en) 2015-07-24 2017-01-25 Alcatel Lucent Method and apparatus for encoding and decoding a video signal based on vectorised spatiotemporal surfaces
KR20220003511A (ko) * 2019-03-20 2022-01-10 V-Nova International Ltd Low complexity enhancement video coding
EP3962082A4 (en) * 2019-06-12 2022-10-05 Sony Group Corporation IMAGE PROCESSING DEVICE AND METHOD
CN114710661A (zh) * 2019-09-20 2022-07-05 Hangzhou Hikvision Digital Technology Co., Ltd. Decoding and encoding method, apparatus and device
KR20210034534A (ko) * 2019-09-20 2021-03-30 Electronics and Telecommunications Research Institute Image encoding/decoding method and apparatus, and recording medium storing a bitstream
CN112106362A (zh) * 2019-09-30 2020-12-18 SZ DJI Technology Co., Ltd. Image processing method and apparatus for a movable platform, movable platform and medium
WO2021093837A1 (en) * 2019-11-15 2021-05-20 Mediatek Inc. Method and apparatus for signaling horizontal wraparound motion compensation in vr360 video coding
US12022126B2 (en) * 2019-11-22 2024-06-25 Sharp Kabushiki Kaisha Systems and methods for signaling tiles and slices in video coding
KR20220146647A (ko) * 2020-05-19 2022-11-01 Google LLC Dynamic parameter selection for quality-normalized video transcoding
US20230412812A1 (en) * 2022-06-15 2023-12-21 Tencent America LLC Systems and methods for joint signaling of transform coefficient signs

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3038143B2 (ja) * 1994-12-29 2000-05-08 Hyundai Electronics Industries Co., Ltd. Apparatus and method for reducing per-object shape information in video equipment, and polygonal approximation method
US5764808A (en) * 1995-10-26 1998-06-09 Motorola, Inc. Method and device for compact representation of a discrete region contour
US5748789A (en) * 1996-10-31 1998-05-05 Microsoft Corporation Transparent block skipping in object-based video coding systems
KR100229544B1 (ko) * 1997-04-11 1999-11-15 Jun Ju-bum Contour encoding apparatus using a motion estimation technique

Also Published As

Publication number Publication date
EP0888011A2 (en) 1998-12-30
EP0888011A3 (en) 2000-12-27
JPH1127682A (ja) 1999-01-29
IN191385B (zh) 2003-11-29
CN1149852C (zh) 2004-05-12
DE69737443T2 (de) 2007-11-15
US6097756A (en) 2000-08-01
KR19990003658A (ko) 1999-01-15
KR100244769B1 (ko) 2000-02-15
DE69737443D1 (de) 2007-04-19
CN1204215A (zh) 1999-01-06

Similar Documents

Publication Publication Date Title
EP0888011B1 (en) Scalable predictive contour coding method and apparatus
CN1115879C (zh) Image processing system using pixel-by-pixel motion estimation and frame decimation
EP0889652B1 (en) Method and apparatus for encoding a contour of an object based on a contour motion estimation technique
US6272253B1 (en) Content-based video compression
EP0801504B1 (en) Method for encoding a contour of an object in a video signal by using a contour motion estimation technique
US6879633B2 (en) Method and apparatus for efficient video processing
JP2883321B2 (ja) Contour coding method using an error band
US6128041A (en) Method and apparatus for binary shape encoding
CA2341208C (en) Method of multichannel data compression
US9060172B2 (en) Methods and systems for mixed spatial resolution video compression
JPH11513205A (ja) ビデオ符号化装置
EP0813342B1 (en) Method and apparatus for encoding a contour of an object in a video signal by using a contour motion estimation technique
EP0884693A2 (en) Interpolation method for binary image
EP0806742A1 (en) Adaptive contour coding
JPH10290466 (ja) Contour encoding apparatus
EP1180308A1 (en) Method and apparatus for efficient video processing
JPH09261660 (ja) Contour encoding method and contour encoding apparatus
JP2000503174 (ja) Method for motion estimation
Sanderson et al. Image segmentation for compression of images and image sequences
GB2320386A (en) Encoding the contour of an object in a video signal encoder
US6181747B1 (en) Methods and systems for high compression rate encoding and decoding of quasi-stable objects in video and film
KR100275543B1 (ko) Apparatus and method for detecting transform parameters at high speed
Melnikov Operationally optimal shape coding
JPH07107459 (ja) Image encoding apparatus
JPH02206991 (ja) Motion compensation processing system and interframe encoding processing system

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): CH DE FR GB IT LI NL

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

RIC1 Information provided on ipc code assigned before grant

Free format text: 7H 04N 7/26 A, 7G 06T 9/20 B, 7H 04N 7/36 B

17P Request for examination filed

Effective date: 20010605

AKX Designation fees paid

Free format text: CH DE FR GB IT LI NL

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: DAEWOO ELECTRONICS CORPORATION

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): CH DE FR GB IT LI NL

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REF Corresponds to:

Ref document number: 69737443

Country of ref document: DE

Date of ref document: 20070419

Kind code of ref document: P

REG Reference to a national code

Ref country code: CH

Ref legal event code: NV

Representative's name: BOVARD AG PATENTANWAELTE

ET Fr: translation filed
PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20071210

REG Reference to a national code

Ref country code: CH

Ref legal event code: PFA

Owner name: DAEWOO ELECTRONICS CORPORATION

Free format text: DAEWOO ELECTRONICS CORPORATION#686, AHYEON-DONG MAPO-GU#SEOUL 100-095 (KR) -TRANSFER TO- DAEWOO ELECTRONICS CORPORATION#686, AHYEON-DONG MAPO-GU#SEOUL 100-095 (KR)

REG Reference to a national code

Ref country code: CH

Ref legal event code: PUE

Owner name: MAPLE VISION TECHNOLOGIES INC., CA

Free format text: FORMER OWNER: DAEWOO ELECTRONICS CORPORATION, KR

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 69737443

Country of ref document: DE

Representative's name: KLUNKER, SCHMITT-NILSON, HIRSCH, DE

REG Reference to a national code

Ref country code: GB

Ref legal event code: 732E

Free format text: REGISTERED BETWEEN 20130404 AND 20130410

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 69737443

Country of ref document: DE

Representative's name: KLUNKER, SCHMITT-NILSON, HIRSCH, DE

Effective date: 20130313

Ref country code: DE

Ref legal event code: R081

Ref document number: 69737443

Country of ref document: DE

Owner name: MAPLE VISION TECHNOLOGIES INC., CA

Free format text: FORMER OWNER: DAEWOO ELECTRONICS CORP., SEOUL/SOUL, KR

Effective date: 20130313

REG Reference to a national code

Ref country code: FR

Ref legal event code: TP

Owner name: MAPLE VISION TECHNOLOGIES INC., CA

Effective date: 20131226

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20140809

Year of fee payment: 18

Ref country code: CH

Payment date: 20140812

Year of fee payment: 18

Ref country code: DE

Payment date: 20140813

Year of fee payment: 18

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20140813

Year of fee payment: 18

Ref country code: FR

Payment date: 20140808

Year of fee payment: 18

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20140818

Year of fee payment: 18

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 69737443

Country of ref document: DE

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20150815

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150831

Ref country code: IT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150815

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150831

REG Reference to a national code

Ref country code: NL

Ref legal event code: MM

Effective date: 20150901

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20160429

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150901

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20160301

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150815

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150831