WO2010079682A1 - Image compression method, image processing device, image display device, and image display system - Google Patents

Image compression method, image processing device, image display device, and image display system

Info

Publication number
WO2010079682A1
WO2010079682A1 (application PCT/JP2009/071202)
Authority
WO
WIPO (PCT)
Prior art keywords
image
external
local background
external image
unit
Prior art date
Application number
PCT/JP2009/071202
Other languages
English (en)
Japanese (ja)
Inventor
望 仲尾
Original Assignee
コニカミノルタホールディングス株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタホールディングス株式会社 filed Critical コニカミノルタホールディングス株式会社
Priority to JP2010513212A priority Critical patent/JP4573001B2/ja
Publication of WO2010079682A1 publication Critical patent/WO2010079682A1/fr

Links

Images

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/10: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N 19/169: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N 19/17: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, the unit being an image region, e.g. an object
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01: Head-up displays
    • G02B 27/017: Head mounted
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/20: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01: Head-up displays
    • G02B 27/0101: Head-up displays characterised by optical features
    • G02B 2027/014: Head-up displays characterised by optical features comprising information/image processing systems
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01: Head-up displays
    • G02B 27/0179: Display position adjusting means not related to the information to be displayed
    • G02B 2027/0187: Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2380/00: Specific applications
    • G09G 2380/02: Flexible displays

Definitions

  • The present invention relates to a compression method for compressing an input external image, and to an image processing device, an image display device, and an image display system that compress an image by the compression method.
  • Conventionally, there is known an image display device that allows an observer to visually recognize an external image and a real image through a display unit at the same time, for example, a see-through head mounted display (hereinafter, HMD).
  • The HMD disclosed in Patent Document 1 includes a transparent display unit and displays on it an external image, that is, an image input from the outside, so that the observer simultaneously views the external image and the real image seen through the display unit.
  • In contrast, the HMD described in Patent Literature 2 captures the optical image passing through the display unit to obtain a real image, generates a display image by synthesizing an external image with the real image, and displays that display image on the display unit (a video see-through HMD).
  • an external image corresponding to a real image is created based on a part of information of the real image.
  • the HMD described in Patent Literature 3 includes an imaging unit that generates a real image by imaging, and communicates the real image and the external image with an external image processing apparatus.
  • JP 2000-184370 A; JP 2007-219082 A; JP 5-21650 A
  • The present invention aims to provide a compression method capable of suppressing deterioration in image quality while reducing the transmission amount, and an image processing device, an image display device, and an image display system that perform image compression using the compression method.
  • The image processing device of the present invention includes: an external image update processing unit that compares an external image input from outside the device with a local background image showing the real image visually recognized by an observer, position by position, sets as a target area an area of the external image similar to the local background image, and performs on the target area an update process, that is, a process that reduces sharpness; and an external image encoding unit that encodes the external image after the update process.
  • the update process performed by the external image update processing unit may be a smoothing process.
  • With this configuration, the signal values representing the high-frequency components reduced by the smoothing process can be efficiently omitted, so the information amount of the external image can be reduced. For example, when the external image is converted into a signal sequence composed of frequency-domain signals and the abbreviation is applied to that sequence, the amount of information can be suitably reduced. This signal sequence is obtained by arranging signal values in order of the magnitude of the frequency component, and runs of adjacent equal signal values can be abbreviated.
  • the update process performed by the external image update processing unit may be a resolution reduction process.
  • With this configuration, the number of pixels in the target area is reduced, so the amount of information can be reduced.
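As an illustrative sketch (not part of the patent's disclosure), the resolution-reduction update process above can be modeled as block averaging: each 2x2 group of pixels in the target area is replaced by its mean, quartering the pixel count. The function name and the sample values are assumptions for illustration only.

```python
# Illustrative sketch of resolution reduction by 2x2 block averaging.
# Pixel values are integers; averaging uses integer division for simplicity.

def reduce_resolution(area, factor=2):
    """Downsample a 2-D list of pixel values by averaging factor x factor blocks."""
    h, w = len(area), len(area[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [area[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) // len(block))
        out.append(row)
    return out

target = [[10, 12, 200, 202],
          [14, 16, 204, 206],
          [10, 10, 100, 100],
          [10, 10, 100, 100]]
print(reduce_resolution(target))  # [[13, 203], [10, 100]]
```

The 4x4 target area becomes 2x2, reducing the information to be encoded by a factor of four.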
  • When the pixel value of each pixel of the external image includes a plurality of components, the external image update processing unit may apply, to components to which human vision is sensitive, an update process with a small sharpness-reducing effect, or no update process at all, and may apply an update process with a large sharpness-reducing effect to components to which human vision is insensitive.
  • The external image update processing unit may compare the local background image and the external image block by block, where a block is an area of predetermined size, and set as the target area those blocks of the external image that are similar to the corresponding blocks of the local background image.
  • The external image update processing unit may compare representative values of the blocks of the external image and the local background image; the representative value may be the median of the pixel values of the pixels included in the block, the average of those pixel values, or the value of the direct-current component obtained by converting the pixel values into frequency-domain signals.
  • This configuration makes it possible to easily set the target area in units of blocks.
  • When the variance of the pixel values of the pixels belonging to a block of the external image is larger than a predetermined value, the block may be excluded from the target region.
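The block-wise comparison and the variance-based exclusion described in the bullets above can be sketched as follows. This is an illustrative assumption, not the patent's actual criterion: the mean is used as the representative value, and the threshold values are arbitrary.

```python
# Illustrative sketch: a block of the external image becomes a target area
# when its representative value (here the mean) is close to that of the
# co-located local-background block, unless the block's pixel-value variance
# exceeds a threshold (a high-variance block is too detailed to blur safely).

def mean(block):
    return sum(block) / len(block)

def variance(block):
    m = mean(block)
    return sum((p - m) ** 2 for p in block) / len(block)

def is_target(ext_block, bg_block, diff_thresh=8.0, var_thresh=400.0):
    if variance(ext_block) > var_thresh:   # detailed block: never degrade it
        return False
    return abs(mean(ext_block) - mean(bg_block)) <= diff_thresh

flat_similar = [100] * 16
background   = [102] * 16
detailed     = [0, 255] * 8
print(is_target(flat_similar, background))  # True
print(is_target(detailed, background))      # False (variance too high)
```

A real implementation would run this per block over both images and collect the blocks for which it returns True as the target area.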
  • The external image update processing unit may divide each of the external image and the local background image into regions in which the pixel values are substantially equal, label each region with the pixel value of its pixels, and set as a target region any region of the external image that overlaps a region of the local background image having a substantially equal label.
  • This configuration makes it possible to set the target area with high accuracy; in particular, a target region with pixel-level boundaries can be set.
  • The target region may also be set separately for each such region.
  • The external image update processing unit may perform linear separation on the pixel values of the pixels in the target area set in the external image and in the corresponding area of the local background image, and exclude the target area when the two can be separated.
  • This configuration makes it possible to exclude a target area that was set inappropriately because the areas are not in fact similar, so the target area can be set with higher accuracy.
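As a hypothetical sketch of the linear-separation check above, reduced to one-dimensional pixel values: if some threshold cleanly splits the external-image pixels from the local-background pixels, the two regions are clearly different and the candidate target area should be excluded. The function name and data are illustrative assumptions.

```python
# Illustrative 1-D linear-separability test: a single threshold separates
# the two pixel-value sets exactly when one set's maximum lies below the
# other set's minimum.

def linearly_separable(ext_pixels, bg_pixels):
    """True if a single threshold separates the two pixel-value sets."""
    return max(ext_pixels) < min(bg_pixels) or max(bg_pixels) < min(ext_pixels)

similar_ext, similar_bg = [100, 105, 110], [102, 104, 112]
different_ext, different_bg = [10, 20, 30], [200, 210, 220]
print(linearly_separable(similar_ext, similar_bg))      # False -> keep as target
print(linearly_separable(different_ext, different_bg))  # True  -> exclude
```

For multi-component pixel values the same idea generalizes to finding a separating hyperplane rather than a scalar threshold.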
  • The image display system of the present invention includes the image processing device described above and an image display device that communicates with it. The image display device includes: an external image reception processing unit that receives the updated external image transmitted from the image processing device; a display unit that displays the updated external image while letting the real image pass through it; an imaging unit that generates a real image by imaging; and a real image transmission processing unit that transmits the real image to the image processing device. The image processing device further includes: a real image reception processing unit that receives the real image transmitted from the image display device; a local background image generation unit that generates, from the real image, the local background image, that is, an image showing the portion of the real image visually recognized through the updated external image displayed on the display unit; and an external image transmission processing unit that transmits the updated external image to the image display device.
  • the image processing apparatus can record a real image, transmit it to another apparatus, and display it.
  • Another image display system of the present invention includes the image processing device according to any one of claims 1 to 11 and an image display device that communicates with it. The image display device includes: an external image reception processing unit that receives the updated external image transmitted from the image processing device; a display unit that displays the updated external image while letting the real image pass through it; an imaging unit that generates a real image by imaging; a local background image generation unit that generates the local background image from the real image; and a local background image transmission processing unit that transmits the local background image to the image processing device. The image processing device includes a local background image reception processing unit that receives the local background image transmitted from the image display device, and an external image transmission processing unit that transmits the updated external image to the image display device.
  • With this configuration, the image transmitted from the image display device to the image processing device is the local background image, so the transmission amount can be reduced compared with transmitting the full real image.
  • The image display device of the present invention is an image display device that displays an image and communicates images directly with another image display device of the same configuration. It includes: an other-device external image generation unit that generates an external image to be displayed on the other image display device; an other-device local background image reception processing unit that receives the other-device local background image transmitted from the other image display device; an other-device external image update processing unit that compares the other-device external image with the other-device local background image position by position, sets as a target area an area of the other-device external image similar to that local background image, and performs on the target area the update process, that is, the process that reduces sharpness; an other-device external image transmission processing unit that transmits the updated other-device external image to the other image display device; an own-device external image reception processing unit that receives the own-device external image transmitted from the other image display device; a display unit that displays the own-device external image while letting the real image pass through it; an imaging unit that generates a real image by imaging; an own-device local background image generation unit that generates, from the real image, the own-device local background image showing the portion of the real image visually recognized through the displayed own-device external image; and an own-device local background image transmission processing unit that transmits the own-device local background image to the other image display device.
  • With this configuration, a plurality of image display devices can communicate images directly, so an intermediary image processing apparatus can be eliminated.
  • The other-device external image generation unit may generate the other-device external image from the real image generated by the imaging unit.
  • The image compression method of the present invention includes: a first step of comparing an external image and a local background image position by position; a second step of setting, as a target region, an area of the external image determined by the first step to be similar to the local background image; a third step of performing, on the target region set by the second step, an update process that reduces sharpness; and a fourth step of encoding the external image after the update process.
  • According to the present invention, the information amount of the external image can be reduced by lowering the sharpness of regions of the external image similar to the local background image. Such a similar region is difficult for an observer to distinguish when the external image is displayed, so even though the amount of information is reduced, the observer is unlikely to perceive any deterioration in image quality.
  • FIG. 5 is a schematic diagram showing the quantized DCT coefficients of one macroblock arranged in a matrix.
  • FIG. 7 is a schematic diagram of a local background image and an external image illustrating the outline of the update process.
  • FIG. 1 is a perspective view showing an example of the configuration of an image display device (see-through type HMD) according to an embodiment of the present invention.
  • The image display device 1 of this example is built on an eyeglasses frame.
  • The image display device 1 includes a frame 2, a display unit 3 that displays an external image, a right-eye lens 31 that forms part of the display unit 3, a left-eye lens 4, an imaging unit 5 that generates a real image by imaging, a main body unit 6 that communicates with an external image processing apparatus (not shown), and a cable 7 that connects the display unit 3 and the imaging unit 5 to the main body unit 6.
  • the main unit 6 supplies an external image and power to the display unit 3 through the cable 7 and acquires a real image generated by the imaging unit 5. Further, the main body 6 transmits a real image to the image processing apparatus and receives an external image from the image processing apparatus.
  • the display unit 3 includes the right-eye lens 31 and a projection unit 32 that projects an optical image based on an external image input via the cable 7 onto the right-eye lens 31.
  • the projection unit 32 is provided on the upper part of the right-eye lens 31 and includes a light source (not shown) and a display element (not shown) for projecting an optical image of an external image.
  • the light source may be constituted by an LED (Light Emitting Diode) or an LD (Laser Diode).
  • the display element may be configured by a liquid crystal panel that selectively transmits light emitted from the light source and projects an optical image.
  • FIG. 2 is a cross-sectional view illustrating an example of the display operation of the display unit.
  • the optical image of the external image projected from the projection unit 32 is incident on the right eye lens 31 from the upper end thereof.
  • the incident optical image proceeds to the lower part while being totally reflected in the right-eye lens 31, and is reflected by the holographic optical element HOE provided at the lower part of the right-eye lens 31.
  • the reflected optical image enters the eyes of the observer, so that the observer visually recognizes the external image.
  • The real image that has passed through the right-eye lens 31 also enters the observer's eyes and is visually recognized. Therefore, the observer can visually recognize the real image and the external image at the same time; in other words, the observer sees a real image on which the external image is superimposed.
  • The imaging unit 5 is provided integrally with the projection unit 32 of the display unit 3 and images the direction viewed by the observer (the direction of the right-eye lens 31 opposite the side on which the observer is located). The real image thus captured by the imaging unit 5 is input to the main body unit 6 via the cable 7.
  • the imaging unit 5 may include a solid-state imaging device (for example, a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS)).
  • The lens that does not form part of the display unit 3 (the left-eye lens 4) may be omitted from the image display device 1, or may be provided as a dummy lens as in this example.
  • Although the configuration in which the projection unit 32 is provided only for one eye (the right-eye lens 31) has been shown, projection units may be provided for both eyes.
  • Likewise, imaging units 5 may be provided for both eyes.
  • Although the configuration including the main body unit 6 and the cable 7 has been described, the main body unit 6 and the cable 7 may be omitted by incorporating the functions of the main body unit 6 into the projection unit 32.
  • As the external image that the main body unit 6 acquires from the image processing apparatus, for example, a display image of a computer or a mobile phone, a moving image captured by a web camera, or the like can be used.
  • a real image captured by another image display device (HMD) may be used as an external image.
  • FIG. 3 is a block diagram showing the configuration of the first example of the image display system according to the embodiment of the present invention. Parts that are the same as those in FIG. 1 are denoted by the same reference numerals, and detailed description thereof is omitted.
  • the image display system S of the present example includes the image display device 1 and the image processing device 20 described above. As described above, the image display device 1 transmits a real image to the image processing device. Further, the image processing device 20 transmits an external image to the image display device 1.
  • the image communication between the image display device 1 and the image processing device 20 may be wireless communication using a wireless local area network (LAN) or infrared rays, or wired network communication using a wired LAN.
  • The image display device 1 includes: the imaging unit 5, which generates and outputs a real image by imaging; a real image encoding unit 11, which encodes the real image output from the imaging unit 5 to generate and output a real image signal; a real image transmission unit 12, which transmits the real image signal output from the real image encoding unit 11 to the image processing device 20; an external image reception unit 13, which receives the external image signal transmitted from the image processing device 20; an external image decoding unit 14, which decodes the received external image signal to generate and output an external image; and the display unit 3, which displays the external image output from the external image decoding unit 14.
  • The image processing device 20 includes: a real image reception unit 21, which receives the real image signal transmitted from the image display device 1; a real image decoding unit 22, which decodes the received real image signal to generate and output a real image; a local background image generation unit 23, which generates and outputs a local background image based on the real image output from the real image decoding unit 22; an external image update processing unit 24, which performs the update process on an external image input from the outside, based on the local background image, and outputs the updated external image; an external image encoding unit 25, which encodes the updated external image to generate and output an external image signal; and an external image transmission unit 26, which transmits the external image signal to the image display device 1.
  • FIG. 4 is a flowchart showing the operation of the image display system of the first example.
  • FIG. 4 shows the case where an external image corresponding to one real image is displayed. When real images are captured continuously and external images are displayed as a moving image, the series of operations shown in FIG. 4 is repeated.
  • First, the imaging unit 5 performs imaging to generate a real image (STEP 1).
  • the real image encoding unit 11 compresses and encodes the real image to generate a real image signal (STEP 2).
  • The compression encoding may use a discrete cosine transform (DCT), as in JPEG (Joint Photographic Experts Group) encoding, Motion-JPEG encoding, or MPEG (Moving Picture Experts Group) encoding.
  • In the DCT, a spatial-domain signal expressed as a pixel value for each position (pixel) is converted into a frequency-domain signal expressed as a signal value (a DCT coefficient) for each frequency component.
  • the image may be divided into a plurality of blocks (hereinafter referred to as macroblocks), and DCT may be performed for each macroblock.
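The per-macroblock DCT described above can be sketched directly from the textbook definition of the 2-D DCT-II, without assuming any codec library. This is an illustrative implementation, not one disclosed in the patent.

```python
# Minimal sketch of a 2-D DCT-II over an 8x8 macroblock, computed from the
# definition. For a uniform block, only the DC coefficient is nonzero.
import math

N = 8

def dct2(block):
    def c(k):
        return math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = 0.0
            for y in range(N):
                for x in range(N):
                    s += (block[y][x]
                          * math.cos((2 * y + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * x + 1) * v * math.pi / (2 * N)))
            out[u][v] = c(u) * c(v) * s
    return out

flat = [[128] * N for _ in range(N)]   # uniform (perfectly smooth) block
coeffs = dct2(flat)
print(round(coeffs[0][0]))             # DC term: 128 * 8 = 1024
print(round(coeffs[3][5], 6))          # every AC term is 0
```

The example makes the point exploited later in the document: the smoother the block, the more of its energy collapses into the low-frequency (upper-left) coefficients.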
  • the obtained DCT coefficient is compressed by being quantized.
  • As the quantization process, for example, each DCT coefficient can be divided by a quantization constant, which is a predetermined value, and the fractional part rounded off.
  • the quantization constant for dividing the DCT coefficient of the low frequency component may be set small, and the quantization constant for dividing the DCT coefficient of the high frequency component may be set large.
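The quantization step just described can be sketched as follows. The 4x4 constant table is an illustrative assumption (not from the patent or any standard); it only demonstrates the stated rule that constants grow toward higher frequencies, driving those coefficients to zero.

```python
# Illustrative quantization: divide each DCT coefficient by its quantization
# constant and round. Constants grow toward the bottom-right (high
# frequencies), so high-frequency coefficients tend to round to 0.

def quantize(coeffs, q_table):
    return [[round(c / q) for c, q in zip(crow, qrow)]
            for crow, qrow in zip(coeffs, q_table)]

q_table = [[4,   8,  16,  32],
           [8,  16,  32,  64],
           [16, 32,  64, 128],
           [32, 64, 128, 256]]
coeffs  = [[400, 60, 12, 5],
           [50,  20,  8, 3],
           [10,   6,  4, 2],
           [4,    3,  2, 1]]
print(quantize(coeffs, q_table))
# most high-frequency entries become 0, which the later run-length
# abbreviation exploits
```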
  • The DCT coefficients obtained by the above processing are turned into a signal sequence by, for example, a zigzag scan.
  • The zigzag scan is illustrated in FIG. 5.
  • FIG. 5 is a schematic diagram showing a state where DCT coefficients after quantization in one macroblock are arranged in a matrix.
  • 8 ⁇ 8 DCT coefficients obtained by performing DCT on an 8 ⁇ 8 pixel macroblock are shown.
  • In FIG. 5, DCT coefficients further to the right indicate components of higher horizontal frequency, and DCT coefficients further down indicate components of higher vertical frequency. That is, the frequency component indicated by the upper-left DCT coefficient is the lowest (the component with frequency 0, that is, the direct-current component), and the frequency component indicated by the lower-right DCT coefficient is the highest.
  • a signal sequence is generated by scanning while moving back and forth in an oblique direction from the upper left DCT coefficient Ddc to the lower right DCT coefficient D63. That is, a signal sequence in which DCT coefficients are arranged in the order of Ddc, D1, D2, D3, D4, D5,..., D61, D62, and D63 is generated.
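The Ddc-to-D63 ordering described above can be sketched as follows (illustrative Python; a 4x4 block is used instead of 8x8 for brevity, and the traversal convention is the common anti-diagonal back-and-forth order, assumed rather than taken from the patent):

```python
# Illustrative zigzag scan: coefficients are read along anti-diagonals,
# alternating direction, so low-frequency values come first in the sequence.

def zigzag(matrix):
    n = len(matrix)
    order = sorted(((y, x) for y in range(n) for x in range(n)),
                   key=lambda p: (p[0] + p[1],
                                  p[0] if (p[0] + p[1]) % 2 else p[1]))
    return [matrix[y][x] for y, x in order]

block = [[9, 8, 5, 0],
         [7, 6, 0, 0],
         [4, 0, 0, 0],
         [0, 0, 0, 0]]
print(zigzag(block))  # [9, 8, 7, 4, 6, 5, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
```

Because the quantized high-frequency coefficients sit in the lower-right of the matrix, the zeros collect at the tail of the sequence, which is exactly what makes the run-length abbreviation effective.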
  • The obtained signal sequence is encoded by, for example, run-length encoding. In this encoding, when a DCT coefficient value X appears m times consecutively in the signal sequence, that portion is collectively abbreviated as mX.
  • the above encoding is performed for each component of the pixel value of the real image.
  • Encoding is performed for each of Y (luminance component), Cb (blue color-difference component), and Cr (red color-difference component).
  • Each of these components is encoded by the same processing.
  • the real image signal generated in STEP 2 is transmitted from the real image transmission unit 12 to the image processing device 20 (STEP 3).
  • the transmitted real image signal is received by the real image receiving unit 21 of the image processing apparatus 20 (STEP 4).
  • the real image signal received in STEP 4 is decoded by the real image decoding unit 22 to generate a real image (STEP 5).
  • In the decoding, processing reverse to the above-described encoding is performed. For example, the encoded signal sequence is decoded, the signal sequence is written into a matrix, inverse quantization is performed (for example, multiplying each DCT coefficient by the above quantization constant), and an inverse discrete cosine transform is applied, in that order, to generate the real image.
  • Decoding is performed, for example, for each of the Y, Cb, and Cr components.
  • FIG. 6 is a cross-sectional view showing the fields of view (view angles) of the imaging unit and the observer, and shows the same cross section as FIG.
  • The position of the imaging unit 5 and the position of the observer's eyes differ slightly, so the real image obtained by the imaging unit 5 differs from the real image that the observer actually sees. In STEP 6, therefore, the real image is converted to generate the local background image, that is, an image showing the real image visually recognized by the observer. Specifically, the local background image shows the real image visually recognized by the observer through the external image displayed on the display unit 3.
  • As the conversion process, for example, unnecessary portions are deleted from the real image, or the coordinates of pixels in the real image are converted.
  • the technique disclosed in Japanese Patent Application Laid-Open No. 2003-91720 may be used for the conversion to the local background image. This technique generates an image viewed from the observer's viewpoint by virtually converting the viewpoint of the imaging unit 5 (specifically, converting the coordinates of the real image).
  • FIG. 7 is a schematic diagram of a local background image and an external image showing an outline of the update process.
  • FIG. 8 is a flowchart showing an outline of the update process.
  • Update processing is executed for the target area set in the external image by STEP 72 (STEP 73).
  • the update process is a process for reducing the sharpness of an image, for example, a process for smoothing pixel values of pixels in a target area or a process for reducing resolution.
  • the update process is a smoothing process. Details of the operations of STEP 71 to STEP 73 will be described later.
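The smoothing assumed here can be sketched as a simple 3x3 box filter over the target area. The patent does not specify a particular filter; this choice, the edge handling, and the sample values are illustrative assumptions.

```python
# Illustrative smoothing update process: a 3x3 box filter applied to the
# target area. Averaging suppresses exactly the high-frequency components
# that the later DCT/quantization/run-length stages discard cheaply.
# Edge pixels keep their original values for simplicity.

def box_smooth(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            s = sum(img[y + dy][x + dx]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = s // 9
    return out

area = [[100, 110, 100],
        [110, 200, 110],
        [100, 110, 100]]
print(box_smooth(area)[1][1])  # 115: the bright outlier is flattened
```

After this step, the DCT of a macroblock covering the area has smaller high-frequency coefficients, so more of them quantize to zero.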
  • the updated external image generated in STEP 7 is encoded by the external image encoding unit 25, and an external image signal is generated (STEP 8).
  • The target region updated in STEP 7 has had its high-frequency components reduced by the smoothing process. Therefore, when DCT is performed on a macroblock containing the target region and the resulting DCT coefficients are quantized, the DCT coefficients of the high-frequency components (that is, the coefficients toward the lower right of FIG. 5) are likely to become 0, so the latter half of the signal sequence generated by the zigzag scan becomes a run of consecutive zeros. Run-length encoding can then abbreviate that run, reducing the information amount of the external image signal.
  • the external image signal generated in STEP 8 is transmitted from the external image transmission unit 26 to the image display device 1 (STEP 9).
  • the transmitted external image signal is received by the external image receiving unit 13 of the image display device 1 (STEP 10).
  • the external image signal received in STEP 10 is decoded by the external image decoding unit 14 to generate an external image (STEP 11). In this decoding, processing reverse to the encoding is performed.
  • the external image obtained in STEP 11 is displayed to the observer by the display unit 3 (STEP 12).
  • the external image is displayed superimposed on the real image visually recognized by the observer.
  • the target region similar to the real image has a reduced high-frequency component (sharpness). Since the target area is an area similar to the superimposed real image, it is difficult for the observer to accurately recognize its details (for example, fine shades of color). Therefore, even if the high-frequency component in the target region is reduced, the observer hardly perceives a deterioration in image quality. It is thus possible to reduce the transmission amount without making the observer feel that the image quality of the external image has deteriorated.
  • the above-described encoding method is an encoding method in which abbreviated notation is performed on a signal sequence composed of frequency domain signals. Specifically, it is an encoding method in which consecutive signals having the same value are abbreviated for a signal sequence in which signal values are arranged in order of the magnitude of frequency components.
  • run-length encoding has been exemplified as the encoding method, but the encoding method of the present invention is not limited to this. For example, so-called zero-run-length coding, which abbreviates only runs of the signal value 0, may be used; and when zeros continue to the end of the signal sequence (that is, when the signal values of all components above a certain frequency are 0), an encoding method that outputs an EOB (End Of Block) code to abbreviate the trailing zeros may be used.
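  • A minimal sketch of the zero-run-length variant with an EOB marker described above; the token format ((zero_run, value) pairs plus a terminal "EOB" string) is an assumption made for illustration:

```python
# Sketch of zero-run-length coding with an EOB marker: only runs of zeros
# are abbreviated, and a trailing all-zero tail becomes a single EOB symbol.

EOB = "EOB"

def zero_run_length(seq):
    """Emit (zero_run, value) pairs; a tail of zeros becomes one EOB."""
    # Find the last nonzero coefficient; everything after it is the tail.
    last = -1
    for i, v in enumerate(seq):
        if v != 0:
            last = i
    tokens, run = [], 0
    for v in seq[:last + 1]:
        if v == 0:
            run += 1
        else:
            tokens.append((run, v))
            run = 0
    if last + 1 < len(seq):   # there was a zero tail
        tokens.append(EOB)
    return tokens
```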
  • the image processing apparatus 20 may be configured to output the real image output from the real image decoding unit 22, for example so that another person (such as a person working together with the observer who uses the image display device 1) can check it, or so that it can be recorded.
  • FIG. 9 is a block diagram showing the configuration of the second example of the image display system according to the embodiment of the present invention, and corresponds to FIG. 3 showing the first example.
  • parts common to FIG. 3 are given the same reference numbers, and detailed descriptions thereof are omitted.
  • the image display system Sa of this embodiment includes an image display device 1a and an image processing device 20a, as in the first embodiment. However, the point that the image display device 1a is provided with the local background image generation unit 23a is different from the first embodiment.
  • the image display device 1a includes the imaging unit 5; a local background image generation unit 23a that generates and outputs a local background image based on the real image output from the imaging unit 5; a local background image encoding unit 11a that encodes the local background image output from the local background image generation unit 23a to generate and output a local background image signal; a local background image transmission unit 12a that transmits the local background image signal output from the local background image encoding unit 11a to the image processing device 20a; the external image reception unit 13; the external image decoding unit 14; and the display unit 3.
  • the image processing device 20a includes a local background image receiving unit 21a that receives the local background image signal transmitted from the image display device 1a; a local background image decoding unit 22a that decodes the received local background image signal to generate and output a local background image; the external image update processing unit 24; the external image encoding unit 25; and the external image transmission unit 26.
  • FIG. 10 is a flowchart showing the operation of the image display system of the second embodiment, and corresponds to FIG. 4 showing the operation of the image display system of the first embodiment. It should be noted that the same STEP numbers are assigned to parts that operate in the same manner as in FIG. 4, and detailed descriptions thereof are omitted.
  • the imaging unit 5 captures a real image and generates a real image (STEP 1).
  • the local background image generation unit 23a then generates a local background image from the real image output from the imaging unit 5 (STEP 6a).
  • the local background image generation method is the same as the generation method by the local background image generation unit 23 of the first embodiment.
  • the generated local background image is encoded by the local background image encoding unit 11a to generate a local background image signal (STEP 2a).
  • the generated local background image signal is transmitted from the local background image transmission unit 12a to the image processing device 20a (STEP 3a). Note that the encoding method by the local background image encoding unit 11a is the same as the encoding method by the real image encoding unit 11 of the first embodiment.
  • the local background image signal transmitted in STEP 3a is received by the local background image receiving unit 21a of the image processing device 20a (STEP 4a).
  • the received local background image signal is decoded by the local background image decoding unit 22a to generate a local background image (STEP 5a). Note that the decoding method by the local background image decoding unit 22a is the same as the decoding method by the real image decoding unit 22 of the first embodiment.
  • the subsequent operations are the same as in the first embodiment: the external image update processing by the external image update processing unit 24 (STEP 7), the encoding of the updated external image by the external image encoding unit 25 (STEP 8), the transmission of the external image signal by the external image transmission unit 26 (STEP 9), the reception of the external image signal by the external image reception unit (STEP 10), the decoding of the external image by the external image decoding unit (STEP 11), and the display of the external image by the display unit 3 (STEP 12).
  • a local background image is further generated in the image display device 1a and is transmitted to the image processing device 20a instead of a real image. That is, a local background image having a smaller amount of information than a real image is communicated. Therefore, the transmission amount between the image display device 1a and the image processing device 20a can be further reduced as compared with the first embodiment.
  • in the illustrated configuration, the local background image output from the local background image decoding unit 22a is not output from the image processing device 20a, but it may be configured to be output. The modifications described for the first embodiment can also be applied to the second embodiment.
  • FIG. 11 is a block diagram showing the configuration of the third example of the image display system according to the embodiment of the present invention, and corresponds to FIG. 3 showing the first example.
  • parts common to FIG. 3 are given the same reference numbers, and detailed descriptions thereof are omitted.
  • the image display system Sb of the present embodiment includes a plurality of image display devices, each of which has both the functions of the image display device and the functions of the image processing device shown in the first and second embodiments. In this embodiment, a plurality of image display devices having the same configuration communicate with each other directly.
  • in the following, the image display apparatus that is mainly described is called the first image display apparatus 100, and the image display apparatus that communicates with the first image display apparatus 100 is called the second image display apparatus 101. The prefix "first" attached to a name indicates that the item relates to the display operation of the first image display apparatus 100, and the prefix "second" indicates that it relates to the display operation of the second image display apparatus 101.
  • the first image display apparatus 100 includes: the imaging unit 5, which generates and outputs a first real image by imaging; a local background image generation unit 23b that generates and outputs a first local background image based on the first real image output from the imaging unit 5; a first local background image encoding unit 11b that encodes the first local background image to generate and output a first local background image signal; a first local background image transmission unit 12b that transmits the first local background image signal output from the first local background image encoding unit 11b to the second image display device 101; a second external image generation unit 31 that generates and outputs a second external image; a second local background image receiving unit 21b that receives a second local background image signal transmitted from the second image display device 101; a second local background image decoding unit 22b that decodes the received signal to generate and output a second local background image; a second external image update processing unit 24b that performs the update process on the second external image output from the second external image generation unit 31, based on the second local background image output from the second local background image decoding unit 22b, and outputs the second external image after the update process; a second external image encoding unit 25b that encodes the updated second external image to generate and output a second external image signal; a second external image transmission unit 26b that transmits the second external image signal output from the second external image encoding unit 25b to the second image display device 101; a first external image reception unit 13b that receives a first external image signal transmitted from the second image display device 101; a first external image decoding unit 14b that decodes the received first external image signal to generate a first external image; and the display unit 3, which displays the first external image output from the first external image decoding unit 14b.
  • FIGS. 12 and 13 are flowcharts showing the operation of the image display system of the third embodiment, and correspond to FIG. 4 showing the operation of the image display system of the first embodiment. It should be noted that the same STEP numbers are assigned to parts that operate in the same manner as in FIG. 4, and detailed descriptions thereof are omitted.
  • FIG. 12 is a flowchart for the display operation of the first image display device 100 performed by the first image display device 100.
  • FIG. 13 is a flowchart for the display operation of the second image display device 101 performed by the first image display device 100.
  • the imaging unit 5 captures a real image and generates a first real image (STEP 1b).
  • the local background image generation unit 23b generates a first local background image from the first real image output from the imaging unit 5 (STEP 6b).
  • the first local background image generation method is the same as the generation method by the local background image generation unit 23 of the first embodiment.
  • the generated first local background image is encoded by the first local background image encoding unit 11b to generate a first local background image signal (STEP 2b).
  • the generated first local background image signal is transmitted from the first local background image transmission unit 12b to the second image display device 101 (STEP 3b).
  • the encoding method by the local background image encoding unit 11b is the same as the encoding method by the real image encoding unit 11 of the first embodiment.
  • the second image display device 101 generates a first external image to be displayed on the first image display device (STEP ⁇ ). Specifically, the second image display device 101 generates a first external image, performs the update process on it using the first local background image transmitted from the first image display device 100, and transmits the result to the first image display device 100. Note that the operation of STEP ⁇ of the second image display apparatus 101 is the same as the operation of the first image display apparatus 100 described later with reference to FIG. 13.
  • the signal transmitted from the second image display device 101 is received by the first external image receiving unit 13b of the first image display device 100 (STEP 10b).
  • the first external image signal received in STEP 10b is decoded by the first external image decoding unit 14b to generate a first external image (STEP 11b).
  • the first external image is displayed on the display unit 3 (STEP 12).
  • the decoding method by the first external image decoding unit 14b is the same as the decoding method by the external image decoding unit 14 of the first embodiment.
  • the first image display device 100 also performs the operation shown in FIG. 13 (that is, the operation for the display operation of the second image display device 101) simultaneously with the operation of FIG. 12.
  • the second local background image signal transmitted from the second image display device 101 is received by the second local background image receiving unit 21b (STEP 4b).
  • the received second local background image signal is decoded by the second local background image decoding unit 22b to generate a second local background image (STEP 5b).
  • the decoding method by the second local background image decoding unit 22b is the same as the decoding method by the real image decoding unit 22 of the first embodiment.
  • the second external image generation unit 31 generates a second external image to be displayed on the second image display device 101 (STEP ⁇ ).
  • the second external image may be generated using part or all of the first real image captured by the imaging unit 5; it may be generated using an image captured by an imaging unit other than the imaging unit 5; or it may be generated using an image recorded in a memory (not shown) or an image input to the first image display device 100 from outside.
  • the timing for generating and outputting the second external image is not limited to the example in FIG. 13, and it may be generated and output at any timing as long as it is before STEP 7 b described later.
  • the external image update processing unit 24b performs update processing of the second external image generated in STEP ⁇ based on the second local background image generated in STEP 5b (STEP 7b).
  • the update process by the second external image update processing unit 24b is the same as the update process of the external image update processing unit 24 of the first embodiment.
  • the second external image after the update processing generated by the second external image update processing unit 24b is encoded by the second external image encoding unit 25b, and a second external image signal is generated (STEP 8b).
  • the second external image signal generated in STEP 8b is transmitted from the second external image transmission unit 26b to the second image display device 101 (STEP 9b).
  • the operation of STEP ⁇ by the second image display device 101 shown in FIG. 12 is the same as the operations (STEPs 4b, 5b, ⁇ , and 7b to 9b) of the first image display device 100 shown in FIG. 13. The operation of the second image display device 101 (that is, STEP ⁇ ) is obtained by interchanging "first" and "second" as described above.
  • the same effects as in the first embodiment can be obtained. That is, it is possible to reduce the transmission amount without causing the observer to strongly feel the deterioration of the image quality of the external image. In particular, when the amount of information of the first external image and the second external image is large and the amount of transmission increases when these images are directly transmitted and received, it is preferable to apply the configuration of this embodiment.
  • in the first and second embodiments, an image processing apparatus acting as an intermediary, which performs the update processing and the transmission and reception of external images, is indispensable. In the present embodiment, by contrast, the image display apparatuses 100 and 101 can each perform the update processing and the transmission and reception of external images, so that mutual display becomes possible without requiring an image processing apparatus.
  • in the above description, the first image display device 100 generates the first local background image and transmits it to the second image display device 101 (that is, the same configuration as in the second embodiment). However, the first real image may instead be transmitted to the second image display device 101, and the first local background image may be generated by the second image display device 101 (that is, the same configuration as in the first embodiment). Unless there is a special circumstance that requires the first real image on the second image display device 101 side, such as when the second image display device 101 records the first real image or transmits it to others, the configuration shown in FIG. 11 is preferable.
  • the external image before the update process and the local background image have been described as having the same size (number of pixels), but they may differ. If the sizes differ, the external image update processing unit may convert the size of one of the images (for example, increasing the number of pixels by interpolation processing or reducing the number of pixels by thinning processing). With this configuration, the two images compared during the update process have the same number of pixels, so they can be compared easily. In this case, the conversion may be performed so that the external image does not become larger than the display resolution of the display unit.
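  • The size-matching step described above (interpolation to enlarge, thinning to reduce) can be sketched with a nearest-neighbour resampler; a real system would typically use bilinear or better interpolation, so this is only a minimal illustration:

```python
# Sketch of matching pixel counts before comparison: one image is resized
# to the other's dimensions. Nearest-neighbour sampling covers both
# enlargement (pixel duplication) and reduction (thinning) in one function.

def resize_nearest(img, new_w, new_h):
    h, w = len(img), len(img[0])
    return [[img[y * h // new_h][x * w // new_w]
             for x in range(new_w)]
            for y in range(new_h)]
```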
  • the image display device and the image processing device in the first and second embodiments, and the first image display device and the second image display device in the third embodiment, may communicate with each other by wire or wirelessly.
  • « Update process » Specific examples of the update processing performed in the image display systems described above will be described below with reference to the drawings. In particular, the processing of STEP 71 and STEP 72 shown in FIG. 8 (image comparison and target area setting) and the processing of STEP 73 (execution of the update process) will be described.
  • FIG. 14 is a schematic diagram of a local background image and an external image shown for a first example of image comparison and target area setting.
  • the local background image and the external image are divided into the same number of blocks having a predetermined size.
  • the example shown in FIG. 14 shows a case where the local background image and the external image are each divided into (N+1) × (M+1) blocks (N and M are natural numbers). The position of each block is expressed by coordinates [x, y], where the value of x increases toward the right and the value of y increases toward the bottom. The upper left block of the local background image is A1[0,0] and the lower right block is A1[N,M]; the same applies to the external image, where the upper left block is A2[0,0] and the lower right block is A2[N,M].
  • blocks at corresponding positions (that is, blocks whose x and y coordinates are both equal) are compared. For example, as shown in FIG. 14, A1[s,t] and A2[s,t] are compared (where 0 ≤ s ≤ N and 0 ≤ t ≤ M).
  • as the comparison method, for example, it is possible to determine a representative value for each block, using the average value or the median value of the pixels included in the blocks to be compared, and then to compare the representative values.
  • the representative value is calculated and compared for each pixel-value component, such as YCrCb described above. When the representative values of all the components are similar (for example, when the difference of each component is equal to or smaller than a predetermined size, or when the representative values after gradation reduction are substantially equal), the compared blocks are determined to be similar, and the block of the external image is set as the target area. Note that the invention is not limited to pixel values having YCrCb components; pixel values with components such as RGB or HSV can be compared and processed in the same way.
  • in order to set the target area precisely, the blocks are set small; the block size can be adjusted appropriately from such a viewpoint.
  • when the encoding uses DCT, the component having a frequency of 0 (the DC component) may be used as the representative value; since this component indicates the average pixel value in the block, it can serve as a representative value. In this case, the external image update processing unit 24 (24b) may acquire each DC component. Note that when a DC component is acquired from the real image decoding unit 22 in FIG. 3, it is necessary to acquire the DC component of the block after conversion into the local background image.
  • in addition, the variance of the pixel values of each block of the external image may be calculated, and blocks whose variance exceeds a predetermined value may be excluded from the target area. A block with a large variance is likely to contain an edge portion (for example, the contour of an object, where pixel values change sharply). If the update process described later reduces the sharpness of such a portion, the edge may be blurred and the external image may become unclear; excluding blocks with a large variance from the target area therefore improves the image quality of the external image. Furthermore, by performing this exclusion prior to the comparison process, unnecessary comparisons can be suppressed and the processing simplified.
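  • A minimal sketch of this first target-area method for a single-component image: block division, exclusion of high-variance (edge) blocks, then comparison of block means. The thresholds diff_th and var_th are invented for illustration; the text only speaks of "a predetermined size" and "a predetermined value":

```python
# Sketch of target-area setting by block comparison: both images are
# divided into equal blocks, high-variance blocks of the external image
# are excluded first, and the rest are compared by their mean value.

def blocks(img, bs):
    """Split img into bs x bs blocks, keyed by block coordinate (bx, by)."""
    h, w = len(img), len(img[0])
    return {(bx, by): [img[y][x]
                       for y in range(by * bs, (by + 1) * bs)
                       for x in range(bx * bs, (bx + 1) * bs)]
            for by in range(h // bs) for bx in range(w // bs)}

def mean(p):
    return sum(p) / len(p)

def variance(p):
    m = mean(p)
    return sum((v - m) ** 2 for v in p) / len(p)

def target_blocks(local_bg, external, bs, diff_th=8, var_th=100):
    """Return block coordinates of the external image set as target area."""
    b1, b2 = blocks(local_bg, bs), blocks(external, bs)
    targets = set()
    for key, ext_px in b2.items():
        if variance(ext_px) > var_th:      # edge block: keep it sharp
            continue
        if abs(mean(ext_px) - mean(b1[key])) <= diff_th:
            targets.add(key)
    return targets
```

For multi-component pixels the same comparison would be repeated per component, as the text describes.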
  • FIG. 15 is a schematic diagram of a local background image, an external image, and a superimposed image shown in a second example of image comparison and target area setting.
  • in this example, region division is performed based on pixel values. Specifically, a component of each pixel value of the local background image and the external image (for example, YCrCb) is reduced in gradation, and a labeling process (a process that forms regions by connecting adjacent pixels having substantially equal pixel values and attaching a label to each region) is performed; each region is given a label corresponding to the pixel value of the pixels belonging to it. When a region of a certain label surrounds a minimal region of another label (for example, a region of only several pixels), processing that regards the minimal region as part of the surrounding label's region may be performed.
  • alternatively, a threshold division method may be used, in which thresholds of pixel values are detected based on a histogram of pixel values and regions are divided by determining, based on the thresholds, whether pixel values are substantially equal; or a region growing method may be used, in which, starting from a certain pixel, neighboring pixels having similar pixel values are successively joined to form a region. Various other well-known region segmentation methods can also be used. Even with these methods, it is preferable to reduce the gradation of the image in advance, as described above.
  • FIG. 15 shows the regions divided by the methods described above, together with the labels attached to the respective regions. The label G1 in the local background image and the label g1 in the external image are attached to regions consisting of pixels with substantially equal pixel values; G5 and g5 likewise indicate such corresponding regions. An area where the labels correspond (for example, the area where G1 and g1 overlap, or the area where G5 and g5 overlap) is set as the target area. When the target area is set in this manner, an area of the external image that is similar to the local background image can be set accurately as the target area; in particular, a target region with pixel-unit boundaries can be set. Therefore, the quality of the external image displayed on the display unit 3 can be improved.
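  • A minimal sketch of this second method for one pixel-value component: gradation reduction followed by 4-connected labelling. The reduction step of 64 and the choice of 4-connectivity are illustrative assumptions, not specified by the text:

```python
# Sketch of region division by labelling: pixel values are gradation-
# reduced, then adjacent (4-connected) pixels with the same reduced value
# are joined into labelled regions.

def reduce_gradation(img, step=64):
    return [[v // step for v in row] for row in img]

def label_regions(img):
    """4-connected component labelling; returns a label map."""
    h, w = len(img), len(img[0])
    labels = [[-1] * w for _ in range(h)]
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy][sx] != -1:
                continue
            stack, val = [(sx, sy)], img[sy][sx]
            labels[sy][sx] = next_label
            while stack:                     # flood fill one region
                x, y = stack.pop()
                for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                    if (0 <= nx < w and 0 <= ny < h
                            and labels[ny][nx] == -1
                            and img[ny][nx] == val):
                        labels[ny][nx] = next_label
                        stack.append((nx, ny))
            next_label += 1
    return labels
```

Overlaying the two label maps and collecting positions where the labelled regions correspond would then give the target area.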
  • FIG. 16A and FIG. 16B are pixel value graphs showing a third example of image comparison and target area setting.
  • the method of this example is preferably applied as a determination process for judging whether each target region set by the methods of the first and second examples described above is appropriate.
  • FIG. 16 shows a case where the pixel value of each pixel of the image includes three components, Y, Cr and Cb.
  • the determination process of this example performs determination based on the distribution of the pixel values of the pixels contained in the target area of the external image and in the corresponding area of the local background image. Specifically, as shown in FIGS. 16A and 16B, each pixel of both regions is plotted according to its pixel value, a predetermined separation method is then tried, and it is determined whether the set of pixel values of the local background image and the set of pixel values of the external image are separable.
  • the graphs shown in FIGS. 16A and 16B are three-dimensional graphs whose mutually perpendicular axes are Y, Cr and Cb; triangles indicate pixels of the local background image and crosses indicate pixels of the external image.
  • FIG. 16A shows a case where separation is possible as a result of applying linear separation by a predetermined method to the set of pixel values of the local background image and the set of pixel values of the external image (the broken line in the figure is the boundary obtained by the linear separation). In this case, the target area of the external image and the corresponding area of the local background image are judged not to be similar; that is, if the sharpness of the target area of the external image were reduced and displayed on the display unit 3, the observer could easily notice the deterioration in image quality. Therefore, such an area of the external image is excluded from the target area.
  • FIG. 16B shows a case where separation is impossible as a result of applying linear separation by a predetermined method to the set of pixel values of the local background image and the set of pixel values of the external image. In this case, the target area of the external image and the corresponding area of the local background image are judged to be similar; that is, even when the sharpness of the target area of the external image is reduced and displayed on the display unit 3, it is difficult for the observer to notice the deterioration in image quality. Therefore, such an area of the external image is not excluded from the target area.
  • in the above examples, the determination of similarity between the local background image and the external image is performed using all the components included in the pixel values, and the target region is set accordingly. However, the similarity determination may also be performed using only predetermined components; for example, it may be performed using only the color components.
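  • The text leaves the "predetermined separation method" open. One simple concrete choice is a perceptron, which reaches zero training error exactly when the two point sets are linearly separable; the sketch below is that choice, not a method specified by the document:

```python
# Sketch of the separability test: pixels of both regions are points in
# (Y, Cr, Cb) space; a perceptron is trained to split them. If it finds a
# boundary, the sets are linearly separable (regions differ); otherwise
# they overlap (regions are similar).

def perceptron_separable(points_a, points_b, epochs=200):
    """True if a linear boundary separating the two 3-D point sets is found."""
    data = [(p, 1) for p in points_a] + [(p, -1) for p in points_b]
    w, b = [0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        errors = 0
        for p, t in data:
            s = sum(wi * pi for wi, pi in zip(w, p)) + b
            if t * s <= 0:                       # misclassified: update
                w = [wi + t * pi for wi, pi in zip(w, p)]
                b += t
                errors += 1
        if errors == 0:
            return True                          # separable: not similar
    return False                                 # not separated: similar
```

Note that the perceptron only terminates with certainty on separable data; the fixed epoch budget is a practical cutoff for the non-separable case.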
  • a smoothing process can be applied as the update process.
  • for example, filter processing using an n×n averaging filter (for example, a filter whose filter coefficients are all equal) may be performed.
  • in the following, f(i, j) denotes a pixel value in the target area before the update process, and g(i, j) denotes the corresponding pixel value after the update process; [n/2] represents the largest integer not exceeding n/2, and n is a natural number of 2 or more. With these definitions, the n×n averaging filter can be written as g(i, j) = (1/n²) Σ_k Σ_l f(i+k, j+l), where k and l each run from −[n/2] to [n/2].
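  • A sketch of the n×n averaging filter described above for a single-component image; border handling by clamping coordinates is an assumption, since the text does not specify how borders are treated:

```python
# Sketch of the averaging filter: each output pixel g(i, j) is the mean
# of the n x n neighbourhood of f centred on (i, j). Out-of-range
# coordinates are clamped to the nearest valid pixel.

def mean_filter(img, n=3):
    h, w = len(img), len(img[0])
    r = n // 2                      # [n/2]: largest integer <= n/2
    out = [[0.0] * w for _ in range(h)]
    for j in range(h):
        for i in range(w):
            acc = 0.0
            for dj in range(-r, r + 1):
                for di in range(-r, r + 1):
                    y = min(max(j + dj, 0), h - 1)   # clamp at borders
                    x = min(max(i + di, 0), w - 1)
                    acc += img[y][x]
            out[j][i] = acc / (n * n)
    return out
```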
  • alternatively, the smoothing process may be performed using a median filter (for example, a filter that outputs the median of the pixel values of the pixel to be calculated and its surrounding pixels as the filtered pixel value) or a Gaussian filter (for example, a filter whose filter coefficients approximately follow a Gaussian distribution). Other known smoothing filters may also be used.
  • the update process may be performed only on specific components of the pixel values, or the processing content may be varied depending on the type of component. For example, the smoothing process may be skipped for the luminance component Y, to which humans (observers) are sensitive, or the degree of smoothing may be reduced for it. Conversely, the color-difference components CrCb (in particular the blue color-difference component Cb), to which humans are insensitive, may be subjected to the smoothing process, or the degree of smoothing may be increased for these components. With such a configuration, the image quality degradation perceived by the observer can be suppressed efficiently.
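  • The component-selective updating described above can be sketched as follows; the 3-tap row average is an arbitrary stand-in for whatever smoothing filter is actually used:

```python
# Sketch of component-selective smoothing: the chroma planes (Cr, Cb) are
# smoothed while the luminance plane Y is left untouched.

def smooth_row(row):
    """3-tap average along a row, clamping at the ends."""
    n = len(row)
    return [(row[max(i - 1, 0)] + row[i] + row[min(i + 1, n - 1)]) / 3
            for i in range(n)]

def update_ycbcr(y_plane, cr_plane, cb_plane):
    """Return (Y, Cr, Cb) with smoothing applied to chroma only."""
    return (y_plane,                              # Y: keep sharp
            [smooth_row(r) for r in cr_plane],    # Cr: smoothed
            [smooth_row(r) for r in cb_plane])    # Cb: smoothed
```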
  • the filter size may be changed as appropriate, or the applied filter and filter coefficient may be changed as appropriate.
  • the case where the smoothing process is performed in order to reduce the amount of information after encoding has mainly been described, but reducing the resolution of the pixel-value components (particularly the color components) of the target region can similarly reduce the amount of information of the encoded signal. Note that it is preferable to perform interpolation processing appropriately at the time of decoding or reproduction in order to return the components of the target area whose resolution was reduced to the number of pixels before the reduction.
  • although an HMD has been described as an example of the image display device, the present invention can also be applied to, for example, a head-up display (HUD).
  • the present invention can be used for HMD and HUD.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The invention relates to an image processing apparatus comprising a local background image generation unit that generates a local background image from a captured real image transmitted from an image display apparatus, and an external image update unit that sets a target area on the basis of the local background image and of an external image input from outside, and updates the target area of the external image by reducing its sharpness.
PCT/JP2009/071202 2009-01-09 2009-12-21 Image compression method, image processing apparatus, image display apparatus, and image display system WO2010079682A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010513212A JP4573001B2 (ja) 2009-01-09 2009-12-21 画像圧縮方法、画像処理装置、画像表示装置及び画像表示システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009-003365 2009-01-09
JP2009003365 2009-01-09

Publications (1)

Publication Number Publication Date
WO2010079682A1 true WO2010079682A1 (fr) 2010-07-15

Family

ID=42316451

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/071202 WO2010079682A1 (fr) 2009-12-21 Image compression method, image processing apparatus, image display apparatus, and image display system

Country Status (2)

Country Link
JP (1) JP4573001B2 (fr)
WO (1) WO2010079682A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011002753A (ja) * 2009-06-22 2011-01-06 Sony Corp Head-mounted display and image display method for a head-mounted display
JP2017005372A (ja) * 2015-06-05 2017-01-05 Canon Inc. Communication apparatus and control method therefor
WO2023062996A1 (fr) * 2021-10-13 2023-04-20 Sony Group Corporation Information processing device, information processing method, and program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004056335A (ja) * 2002-07-18 2004-02-19 Sony Corp Information processing device and method, display device and method, and program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH099253A (ja) * 1995-06-19 1997-01-10 Toshiba Corp Image compression communication device
JPH10112856A (ja) * 1996-10-04 1998-04-28 Agency Of Ind Science & Technol Image transmission apparatus and method
JP2008300983A (ja) * 2007-05-29 2008-12-11 Canon Inc Head-mounted display device and control method therefor

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011002753A (ja) * 2009-06-22 2011-01-06 Sony Corp Head-mounted display and image display method in a head-mounted display
US9189829B2 (en) 2009-06-22 2015-11-17 Sony Corporation Head mounted display, and image displaying method in head mounted display
JP2017005372A (ja) * 2015-06-05 2017-01-05 Canon Inc Communication device and control method therefor
WO2023062996A1 (fr) * 2021-10-13 2023-04-20 Sony Group Corporation Information processing device, information processing method, and program

Also Published As

Publication number Publication date
JP4573001B2 (ja) 2010-11-04
JPWO2010079682A1 (ja) 2012-06-21

Similar Documents

Publication Publication Date Title
TWI806854B (zh) System for near-eye displays
US7483486B2 (en) Method and apparatus for encoding high dynamic range video
JP6178017B2 (ja) Depth-aware enhancement for stereo video
US8488870B2 (en) Multi-resolution, multi-window disparity estimation in 3D video processing
US11006113B2 (en) Image processing device, method, and program deciding a processing parameter
WO2012086120A1 (fr) Image processing apparatus, image capture apparatus, image processing method, and program
KR100653965B1 (ko) 3D stereoscopic image processing device using different cameras of a portable terminal
US8014611B2 (en) Image compression method, image compression device, image transmission system, data compression pre-processing apparatus, and computer program
EP3635952B1 (fr) Compression of a digital content stream
CN109643456A (zh) Image compression method and device
WO2018116604A1 (fr) Image processing device, image processing method, and program
WO2018116603A1 (fr) Image processing device, image processing method, and program
KR20190010296A (ko) Electronic device and method for compressing image data in an electronic device
JP2019106572A (ja) Image encoding device, control method therefor, and program
US8433132B2 (en) Method for efficient representation and processing of color pixel data in digital pathology images
JP4573001B2 (ja) Image compression method, image processing apparatus, image display apparatus, and image display system
JP7512492B2 (ja) Image processing device and method for performing quality-optimized deblocking
US20110135199A1 (en) Coding apparatus and method for simultaneous transmission of image processing information and color information
JP2005101720A (ja) Partial image encoding device
WO2023093768A1 (fr) Image processing method and apparatus
CN115272440A (zh) Image processing method, device, and system
JP6341598B2 (ja) Image encoding device, image decoding device, image encoding program, and image decoding program
KR101609394B1 (ko) Stereoscopic image encoding apparatus and method
US11233999B2 (en) Transmission of a reverse video feed
WO2011158562A1 (fr) Multi-view image encoding device

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2010513212

Country of ref document: JP

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09837567

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09837567

Country of ref document: EP

Kind code of ref document: A1