WO2010100985A1 - Image processing apparatus and image processing method - Google Patents
Image processing apparatus and image processing method
- Publication number
- WO2010100985A1 (PCT/JP2010/051310)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- image
- frame
- output
- frame image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/14—Picture signal circuitry for video frequency region
- H04N5/21—Circuitry for suppressing or minimising disturbance, e.g. moiré or halo
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20201—Motion blur correction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2101/00—Still video cameras
Definitions
- the present invention relates to an image display technique.
- the intermittent pattern will be supplementally explained.
- the photographing time of one image by a video camera is 1/60 sec.
- the shutter is not kept open for 1/60 sec. Instead, the time of 1/60 sec is divided, for example, into 1,000, and photographing is done by opening and closing the shutter for every divided short time.
- the intermittent pattern can be expressed by a binary number of 1,000 digits.
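To make the idea concrete, the sketch below builds such a pattern as a 1,000-digit binary sequence for a 1/60 sec frame time. The random generation, the 50% open ratio, and all function names are illustrative assumptions, not details taken from this publication.

```python
import numpy as np

def make_intermittent_pattern(slots=1000, frame_time=1.0 / 60.0,
                              open_ratio=0.5, seed=0):
    """Build a binary shutter pattern: one digit per 1/1000th of the frame
    time, where 1 means the shutter is open and 0 means it is closed."""
    rng = np.random.default_rng(seed)
    pattern = (rng.random(slots) < open_ratio).astype(np.uint8)
    exposure = pattern.sum() * frame_time / slots  # total open time in seconds
    return pattern, exposure

pattern, exposure = make_intermittent_pattern()
print(f"open slots: {pattern.sum()} / {len(pattern)}, exposure: {exposure:.5f} s")
```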
- the present invention has been made to overcome the conventional drawbacks, and provides a technique for displaying a suitable image even when displaying, as a movie display image or still image display image, a frame image which forms a movie.
- an image processing apparatus comprising: a unit which inputs a movie formed from a plurality of frame images; a unit which acquires an instruction representing that the frame image is to be displayed as an image in a movie or the frame image is to be displayed as a still image; a first output unit which, when the acquired instruction represents that the frame image is to be displayed as an image in a movie, blurs an outline in the frame image to update the frame image and output the updated frame image as a movie display image; and a second output unit which, when the acquired instruction represents that the frame image is to be displayed as a still image, removes a motion blur from the frame image to update the frame image and output the updated frame image as a still image display image.
- an image processing apparatus comprising: a unit which acquires a stream generated by an apparatus including an input unit which inputs a movie formed from a plurality of frame images, a blurring unit which blurs outlines in the respective frame images input by the input unit, a first encode unit which encodes the respective frame images obtained by the blurring unit, a decode unit which decodes the respective frame images encoded by the first encode unit, a unit which obtains difference images between corresponding frame images among the respective frame images decoded by the decode unit and the respective frame images input by the input unit, a second encode unit which encodes the respective difference images, and a unit which generates the stream containing an encoding result of the first encode unit and an encoding result of the second encode unit; a first decode unit which decodes the encoding result of the first encode unit contained in the stream; a second decode unit which decodes the encoding result of the second encode unit contained in the stream.
- an image processing method using a computer to perform the steps of: an input step of inputting a movie formed from a plurality of frame images; a step of acquiring an instruction representing that the frame image is to be displayed as an image in a movie or the frame image is to be displayed as a still image; a first output step of, when the acquired instruction represents that the frame image is to be displayed as an image in a movie, blurring an outline in the frame image to update the frame image and output the updated frame image as a movie display image; and a second output step of, when the acquired instruction represents that the frame image is to be displayed as a still image, removing a motion blur from the frame image to update the frame image and output the updated frame image as a still image display image.
- an image processing method using a computer to perform the steps of: a step of acquiring a stream generated by an apparatus including an input unit which inputs a movie formed from a plurality of frame images, a blurring unit which blurs outlines in the respective frame images input by the input unit, a first encode unit which encodes the respective frame images obtained by the blurring unit, a decode unit which decodes the respective frame images encoded by the first encode unit, a unit which obtains difference images between corresponding frame images among the respective frame images decoded by the decode unit and the respective frame images input by the input unit, a second encode unit which encodes the respective difference images, and a unit which generates the stream containing an encoding result of the first encode unit and an encoding result of the second encode unit; a first decode step of decoding the encoding result of the first encode unit contained in the stream; a second decode step of decoding the encoding result of the second encode unit contained in the stream.
- FIG. 1 is a block diagram exemplifying the functional arrangement of an image processing apparatus according to the first embodiment
- FIG. 2 is a block diagram exemplifying the functional arrangement of an encoding apparatus serving as an image processing apparatus according to the second embodiment
- FIG. 3 is a block diagram exemplifying the functional arrangement of a decoding apparatus serving as an image processing apparatus according to the second embodiment
- Fig. 4 is a view exemplifying the arrangement of a Laplacian filter kernel
- Fig. 5 is a view exemplifying the arrangement of a lowpass filter kernel
- Fig. 6 is a flowchart showing details of processing in step S705;
- Fig. 7 is a flowchart of processing performed by the image processing apparatus according to the first embodiment
- FIG. 8 is a flowchart of processing performed by the encoding apparatus serving as an image processing apparatus according to the second embodiment
- Fig. 9 is a flowchart of processing performed by the decoding apparatus serving as an image processing apparatus according to the second embodiment.
- Fig. 10 is a block diagram exemplifying the hardware configuration of a computer applicable to an image processing apparatus (including encoding and decoding apparatuses) in each embodiment.
- Fig. 1 is a block diagram exemplifying the functional arrangement of the image processing apparatus according to the first embodiment.
- the image processing apparatus according to the first embodiment includes a data input unit 101, demultiplexing unit 102, stream decode unit 103, selector 104, motion blur removal unit 105, multiple outline removal unit 106, selector 107, image output unit 108, and switching signal input unit 109.
- the data input unit 101 externally acquires an H.264 stream which is movie data encoded according to ITU-T H.264 (ISO/IEC 14496-10).
- the data input unit 101 sends the acquired H.264 stream to the subsequent demultiplexing unit 102.
- photographing information is added (multiplexed) to the H.264 stream.
- the photographing information is information containing motion information measured by a gyro sensor mounted in a camera which has recorded the movie, and the above-mentioned intermittent pattern.
- upon receiving the H.264 stream multiplexed with the photographing information from the data input unit 101, the demultiplexing unit 102 demultiplexes it into the H.264 stream and the photographing information. The demultiplexing unit 102 sends the H.264 stream to the subsequent stream decode unit 103 and the photographing information to the motion blur removal unit 105.
- upon receiving the H.264 stream from the demultiplexing unit 102, the stream decode unit 103 decodes the images (frame images) of respective frames contained in the H.264 stream.
- the switching signal input unit 109 receives an instruction representing which of a movie display image and a still image display image is to be displayed as the frame image decoded by the stream decode unit 103. Upon receiving this instruction, the switching signal input unit 109 controls the selectors 104 and 107 accordingly.
- the switching signal input unit 109 controls the selector 104 to input a frame image output from the stream decode unit 103 to the multiple outline removal unit 106. Also, the switching signal input unit 109 controls the selector 107 to input an output from the multiple outline removal unit 106 to the image output unit 108. That is, when the input instruction instructs to display the frame image as a movie display image, the switching signal input unit 109 controls the selectors 104 and 107 to send a frame image output from the stream decode unit 103 to the image output unit 108 via the multiple outline removal unit 106.
- the switching signal input unit 109 controls the selector 104 to input a frame image output from the stream decode unit 103 to the motion blur removal unit 105. Further, the switching signal input unit 109 controls the selector 107 to input an output from the motion blur removal unit 105 to the image output unit 108. That is, when the input instruction instructs to display the frame image as a still image display image, the switching signal input unit 109 controls the selectors 104 and 107 to send a frame image output from the stream decode unit 103 to the image output unit 108 via the motion blur removal unit 105.
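A minimal sketch of this switching behaviour, with the two processing units passed in as plain callables; the function and parameter names are placeholders and not taken from this publication.

```python
def route_frame(frame, instruction, blur_outlines_fn, remove_motion_blur_fn):
    """Mirror selectors 104 and 107: route the decoded frame either to the
    multiple outline removal unit (movie display) or to the motion blur
    removal unit (still image display)."""
    if instruction == "movie":
        return blur_outlines_fn(frame)       # becomes the movie display image
    if instruction == "still":
        return remove_motion_blur_fn(frame)  # becomes the still image display image
    raise ValueError('instruction must be "movie" or "still"')
```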
- upon receiving a frame image from the selector 104, the multiple outline removal unit 106 first detects an outline in the frame image by applying a Laplacian filter kernel having an arrangement exemplified in Fig. 4 to each pixel which forms the frame image. Then, the multiple outline removal unit 106 blurs the outline (blur processing) by applying a lowpass filter kernel having an arrangement exemplified in Fig. 5 to each pixel which forms the detected outline. Accordingly, the multiple outline removal unit 106 updates the frame image received from the selector 104 to a frame image having a blurred outline. The multiple outline removal unit 106 sends the updated frame image (frame image having a blurred outline) to the subsequent selector 107.
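A minimal sketch of this outline blurring, assuming a standard 3x3 Laplacian kernel and a 3x3 box filter in place of the kernels of Figs. 4 and 5 (which are not reproduced here), plus an arbitrary detection threshold:

```python
import numpy as np
from scipy.signal import convolve2d

# Kernel values and threshold are illustrative assumptions; the patent's
# actual kernel arrangements are the ones shown in Figs. 4 and 5.
LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)
LOWPASS = np.full((3, 3), 1.0 / 9.0)  # simple 3x3 box (lowpass) filter

def blur_outlines(frame, threshold=32.0):
    """Detect outline pixels with the Laplacian, then replace only those
    pixels with their lowpass-filtered values."""
    frame = frame.astype(float)
    response = np.abs(convolve2d(frame, LAPLACIAN, mode="same", boundary="symm"))
    blurred = convolve2d(frame, LOWPASS, mode="same", boundary="symm")
    out = frame.copy()
    mask = response > threshold  # pixels regarded as forming an outline
    out[mask] = blurred[mask]
    return out
```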
- upon receiving a frame image from the selector 104, the motion blur removal unit 105 updates it by removing a motion blur from the frame image by using motion information and an intermittent pattern that are contained in photographing information received from the demultiplexing unit 102. As described above, the processing of removing a motion blur from an image using motion information and an intermittent pattern is a well-known technique, and a description thereof will be omitted.
- the first embodiment assumes that motion information is contained in advance in photographing information and multiplexed in an H.264 stream. However, the motion information can be dynamically obtained by calculating a motion vector using each frame image. Thus, the motion information need not always be contained in advance in photographing information.
- the motion blur removal unit 105 sends the updated frame image (motion blur-removed frame image) to the subsequent selector 107.
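For orientation only, the sketch below shows one common way a coded-exposure (intermittent-pattern) blur can be inverted, assuming uniform horizontal motion whose extent in pixels comes from the motion information; the Wiener-style regularization and all names here are assumptions, not the method claimed in this publication.

```python
import numpy as np

def psf_from_pattern(pattern, blur_length_px):
    """Resample the binary open/close pattern onto the motion extent in
    pixels, giving a 1-D point spread function for the coded exposure."""
    idx = (np.arange(blur_length_px) * len(pattern) / blur_length_px).astype(int)
    psf = np.asarray(pattern, dtype=float)[idx]
    return psf / psf.sum()

def remove_motion_blur(frame, pattern, blur_length_px, eps=1e-2):
    """Deconvolve each row along the assumed horizontal motion direction
    with a regularized (Wiener-style) inverse filter."""
    h, w = frame.shape
    psf = np.zeros(w)
    psf[:blur_length_px] = psf_from_pattern(pattern, blur_length_px)
    H = np.fft.fft(psf)
    F = np.fft.fft(frame.astype(float), axis=1)
    deblurred = np.fft.ifft(F * np.conj(H) / (np.abs(H) ** 2 + eps), axis=1).real
    return np.clip(deblurred, 0, 255)
```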
- the selector 107 transfers, to the image output unit 108, the output (frame image) selected by the switching signal input unit 109 from the outputs of the motion blur removal unit 105 and the multiple outline removal unit 106.
- the image output unit 108 outputs the frame image received from the selector 107.
- the output destination is not particularly limited, and may be a display device formed from a CRT or liquid crystal display, or a storage device.
- Fig. 7 is a flowchart of processing performed by the image processing apparatus according to the first embodiment.
- the data input unit 101 acquires an H.264 stream multiplexed with photographing information.
- in step S702, the demultiplexing unit 102 receives the H.264 stream multiplexed with photographing information from the data input unit 101, and demultiplexes it into the H.264 stream and photographing information.
- the demultiplexing unit 102 sends the H.264 stream to the subsequent stream decode unit 103 and the photographing information to the subsequent motion blur removal unit 105.
- in step S703, upon receiving the H.264 stream from the demultiplexing unit 102, the stream decode unit 103 decodes the images (frame images) of respective frames contained in the H.264 stream.
- the stream decode unit 103 sequentially sends the decoded frame images of the respective frames to the subsequent selector 104.
- the switching signal input unit 109 controls the selector 104 to input a frame image output from the stream decode unit 103 to the multiple outline removal unit 106. Also, the switching signal input unit 109 controls the selector 107 to input an output from the multiple outline removal unit 106 to the image output unit 108. In this case, the process advances to step S705 via step S704.
- the switching signal input unit 109 controls the selector 104 to input a frame image output from the stream decode unit 103 to the motion blur removal unit 105.
- the switching signal input unit 109 controls the selector 107 to input an output from the motion blur removal unit 105 to the image output unit 108. In this case, the process advances to step S706 via step S704.
- in step S705, upon receiving a frame image from the selector 104, the multiple outline removal unit 106 first detects an outline in the frame image by applying a Laplacian filter kernel having the arrangement exemplified in Fig. 4 to each pixel which forms the frame image. Then, the multiple outline removal unit 106 blurs the outline by applying a lowpass filter kernel having the arrangement exemplified in Fig. 5 to each pixel which forms the detected outline. In this way, the multiple outline removal unit 106 updates the frame image received from the selector 104 to a frame image having a blurred outline.
- Fig. 6 is a flowchart showing details of the processing in step S705.
- the multiple outline removal unit 106 detects an outline in a frame image input from the selector 104 by applying a Laplacian filter kernel having the arrangement exemplified in Fig. 4 to each pixel which forms the frame image. If a pixel having a pixel value larger than a predetermined value exists among the pixels to which the Laplacian filter kernel has been applied, the multiple outline removal unit 106 advances the process to step S603 via step S602.
- the multiple outline removal unit 106 blurs the outline by applying a lowpass filter kernel having the arrangement exemplified in Fig. 5 to the pixel.
- the multiple outline removal unit 106 sends the updated frame image (frame image having a blurred outline) to the subsequent selector 107.
- in step S706, upon receiving a frame image from the selector 104, the motion blur removal unit 105 updates it by removing a motion blur from the frame image by using motion information and an intermittent pattern that are contained in the photographing information received from the demultiplexing unit 102.
- the motion blur removal unit 105 sends the updated frame image (motion blur-removed frame image) to the subsequent selector 107.
- in step S707, the selector 107 transfers, to the image output unit 108, the output (frame image) selected by the switching signal input unit 109 from the outputs of the motion blur removal unit 105 and the multiple outline removal unit 106.
- the image output unit 108 outputs the frame image received from the selector 107 (first output and second output).
- the processes in step S704 and subsequent steps are executed for each frame image.
- a movie display image with a natural motion and a still image display image free from a motion blur can be appropriately switched and output in accordance with an external request.
- each frame is processed.
- the present invention is not limited to this, and each pixel or each block may be processed.
- the multiple outline removal unit 106 detects an outline and applies a lowpass filter to it.
- the detection method and filter kernel are not limited to the foregoing examples.
- Fig. 2 is a block diagram exemplifying the functional arrangement of an encoding apparatus serving as an image processing apparatus according to the second embodiment.
- the encoding apparatus includes an image input unit 201, multiple outline removal unit 202, image encode unit 203, decode unit 204, difference extraction unit 205, difference image encode unit 206, multiplexing unit 207, photographing information input unit 208, and data output unit 209.
- the image input unit 201 sequentially receives frame images (intermittently photographed images) photographed intermittently by opening/closing a shutter in accordance with a predetermined intermittent pattern.
- the image input unit 201 sends the externally input frame images to the multiple outline removal unit 202 and difference extraction unit 205.
- the multiple outline removal unit 202 is identical to the multiple outline removal unit 106 described in the first embodiment.
- the multiple outline removal unit 202 updates the input frame image by blurring an outline in the frame image.
- the multiple outline removal unit 202 outputs the updated frame image to the subsequent image encode unit 203.
- the image encode unit 203 encodes the frame image input from the multiple outline removal unit 202 (first encoding).
- the image encode unit 203 outputs the encoded frame image (encoding result) to the subsequent multiplexing unit 207.
- the image encode unit 203 outputs encoded intermediate information such as quantized DCT encoded information to the decode unit 204.
- the decode unit 204 decodes the encoded intermediate information received from the image encode unit 203, generating a decoded frame image.
- the decode unit 204 sends the decoded image generated in this manner to the difference extraction unit 205.
- the difference extraction unit 205 sequentially receives frame images from the image input unit 201 and decoded images from the decode unit 204.
- the difference extraction unit 205 extracts a difference image between corresponding frame images among frame images input from the image input unit 201 and decoded frame images input from the decode unit 204.
- the difference extraction unit 205 sends the extracted difference image to the subsequent difference image encode unit 206.
- the difference image encode unit 206 JPEG-compresses sequentially input difference images (second encoding).
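The encoder-side data flow described above can be summarized as in the sketch below; the codec callables stand in for the H.264 encoder/decoder and the JPEG encoder and are placeholders, not real library calls.

```python
import numpy as np

def encode_with_residual(frame, blur_fn, encode_main_fn, decode_main_fn, encode_diff_fn):
    """Blur outlines, encode the blurred frame (main stream), locally decode
    it, and encode the difference to the original frame (difference stream)."""
    blurred = blur_fn(frame)                  # multiple outline removal unit 202
    main_bits = encode_main_fn(blurred)       # image encode unit 203 (H.264 in the text)
    local_decode = decode_main_fn(main_bits)  # decode unit 204
    residual = frame.astype(np.int16) - local_decode.astype(np.int16)
    diff_bits = encode_diff_fn(residual)      # difference image encode unit 206 (JPEG in the text)
    return main_bits, diff_bits
```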
- the photographing information input unit 208 receives photographing information described in the first embodiment.
- the photographing information input unit 208 sends the received photographing information to the subsequent multiplexing unit 207.
- the multiplexing unit 207 multiplexes the photographing information input from the photographing information input unit 208, the encoding result (difference stream) sent from the difference image encode unit 206, and the encoding result (main stream) sent from the image encode unit 203.
- the multiplexing unit 207 sends the multiplexing result as a stream to the data output unit 209.
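The publication does not specify a container format for this multiplexing, so the sketch below only illustrates one plausible arrangement: length-prefixed sections concatenated in a fixed order.

```python
import struct

def multiplex(main_bits: bytes, diff_bits: bytes, photographing_info: bytes) -> bytes:
    """Hypothetical container: a 4-byte big-endian length followed by the
    payload, in the order main stream, difference stream, photographing
    information."""
    out = bytearray()
    for payload in (main_bits, diff_bits, photographing_info):
        out += struct.pack(">I", len(payload))
        out += payload
    return bytes(out)
```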
- the data output unit 209 outputs the stream received from the multiplexing unit 207.
- the output destination is not particularly limited.
- the stream may be output to a storage device such as a hard disk, or directly to a decoding apparatus having an arrangement shown in Fig. 3.
- Fig. 8 is a flowchart of processing performed by the encoding apparatus serving as an image processing apparatus according to the second embodiment.
- the image input unit 201 sequentially receives frame images (intermittently photographed images).
- the image input unit 201 sends the frame images to the multiple outline removal unit 202 and difference extraction unit 205.
- in step S802, the multiple outline removal unit 202 updates an input frame image by blurring an outline in the frame image.
- the multiple outline removal unit 202 outputs the updated frame image to the subsequent image encode unit 203.
- in step S803, the image encode unit 203 encodes the frame image input from the multiple outline removal unit 202 according to the H.264 encoding scheme (first encoding).
- the image encode unit 203 outputs the encoded frame image (encoding result) to the subsequent multiplexing unit 207. Further, the image encode unit 203 outputs encoded intermediate information such as quantized DCT encoded information to the decode unit 204.
- in step S804, the decode unit 204 decodes the encoded intermediate information received from the image encode unit 203, generating a decoded frame image. The decode unit 204 sends the decoded image generated in this fashion to the difference extraction unit 205.
- in step S805, the difference extraction unit 205 extracts a difference image between corresponding frame images among frame images input from the image input unit 201 and decoded frame images input from the decode unit 204.
- the difference extraction unit 205 sends the extracted difference image to the subsequent difference image encode unit 206.
- in step S806, the difference image encode unit 206 JPEG-compresses sequentially input difference images (second encoding).
- in step S807, the multiplexing unit 207 multiplexes the photographing information input from the photographing information input unit 208, the encoding result (difference stream) sent from the difference image encode unit 206, and the encoding result (main stream) sent from the image encode unit 203.
- the multiplexing unit 207 sends the multiplexing result as a stream to the data output unit 209.
- the data output unit 209 outputs the stream received from the multiplexing unit 207.
- Fig. 3 is a block diagram exemplifying the functional arrangement of a decoding apparatus serving as an image processing apparatus according to the second embodiment.
- as shown in Fig. 3, the decoding apparatus includes a switching signal input unit 308, data input unit 301, data demultiplexing unit 302, main stream decode unit 303, difference stream decode unit 304, image composition unit 305, motion blur removal unit 306, selector 307, and image output unit 309.
- the data input unit 301 receives a stream generated by the encoding apparatus having the arrangement shown in Fig. 2.
- the data input unit 301 sends the stream to the subsequent data demultiplexing unit 302.
- the data demultiplexing unit 302 sends a main stream contained in the stream to the main stream decode unit 303 and a difference stream to the difference stream decode unit 304.
- the data demultiplexing unit 302 sends photographing information contained in the stream to the motion blur removal unit 306.
- the main stream decode unit 303 decodes the main stream (first decoding).
- the difference stream decode unit 304 decodes the difference stream (second decoding).
- the switching signal input unit 308 receives an instruction representing which of a motion blur-removed image and multiple outline-removed image is to be output.
- the switching signal input unit 308 controls the main stream decode unit 303 and difference stream decode unit 304 to set the image composition unit 305 as their output destination.
- the image composition unit 305 receives the decoding results of the main and difference streams.
- the image composition unit 305 composites the decoding results of the main and difference streams. More specifically, the image composition unit 305 generates, for each frame, an image (decoded multiple outline image) obtained by compositing the decoded frame image and decoded difference image.
- the image composition unit 305 sends the decoded multiple outline image to the motion blur removal unit 306.
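A minimal sketch of this composition step, assuming 8-bit frames and a signed difference image like the one produced in the encoder sketch above:

```python
import numpy as np

def composite(decoded_blurred, decoded_difference):
    """Add the decoded difference image back onto the decoded (blurred)
    main-stream frame to recover the multiple outline image."""
    restored = decoded_blurred.astype(np.int16) + decoded_difference.astype(np.int16)
    return np.clip(restored, 0, 255).astype(np.uint8)
```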
- the motion blur removal unit 306 is identical to the motion blur removal unit 105 shown in Fig. 1.
- the motion blur removal unit 306 updates the decoded multiple outline image by removing a motion blur from the decoded multiple outline image by using the photographing information received from the data demultiplexing unit 302.
- the motion blur removal unit 306 sends the updated decoded multiple outline image to the selector 307. In this case, the selector 307 sends the updated multiple outline-removed image to the subsequent image output unit 309.
- the switching signal input unit 308 controls the main stream decode unit 303 to set the selector 307 as the output destination.
- the selector 307 sends the frame image decoded by the main stream decode unit 303 to the subsequent image output unit 309.
- the image output unit 309 outputs an image received from the selector 307.
- the output destination is not particularly limited, similar to the first embodiment.
- Fig. 9 is a flowchart of processing performed by the decoding apparatus serving as an image processing apparatus according to the second embodiment.
- the data input unit 301 acquires a stream and sends it to the subsequent data demultiplexing unit 302.
- in step S902, the data demultiplexing unit 302 sends a main stream contained in the stream to the main stream decode unit 303 and a difference stream to the difference stream decode unit 304.
- the data demultiplexing unit 302 sends photographing information contained in the stream to the motion blur removal unit 306.
- the switching signal input unit 308 controls the main stream decode unit 303 and difference stream decode unit 304 to set the image composition unit 305 as their output destination. In this case, the process advances to step S904 via step S903.
- in step S904, the main stream decode unit 303 decodes the main stream.
- in step S905, the difference stream decode unit 304 decodes the difference stream.
- in step S906, the image composition unit 305 composites the decoding results of the main and difference streams. More specifically, the image composition unit 305 generates, for each frame, an image (decoded multiple outline image) obtained by compositing the decoded frame image and decoded difference image. The image composition unit 305 sends the decoded multiple outline image to the motion blur removal unit 306.
- in step S907, the motion blur removal unit 306 updates the decoded multiple outline image by removing a motion blur from the decoded multiple outline image by using the photographing information received from the data demultiplexing unit 302.
- the motion blur removal unit 306 sends the updated decoded multiple outline image to the selector 307.
- the selector 307 sends the updated multiple outline-removed image to the subsequent image output unit 309.
- the switching signal input unit 308 controls the main stream decode unit 303 to set the selector 307 as the output destination. In this case, the process advances to step S908 via step S903.
- in step S908, the main stream decode unit 303 decodes the main stream.
- the selector 307 sends the frame image decoded by the main stream decode unit 303 to the subsequent image output unit 309.
- the image output unit 309 outputs an image received from the selector 307.
- the second embodiment assumes that motion information is contained in advance in photographing information. However, the motion information can be dynamically obtained by calculating a motion vector using each frame image. Thus, the motion information need not always be contained in advance in photographing information.
- the second embodiment adopts H.264 as a main stream encoding scheme and JPEG as a difference stream encoding scheme, but the present invention is not limited to them.
- each frame is processed.
- the present invention is not limited to this, and each pixel or each block may be processed.
- data are directly exchanged between the respective units for descriptive convenience. Alternatively, data to be exchanged may be temporarily stored in a memory to transfer the stored data to the next transfer destination. This arrangement may be adopted for any purpose.
- the multiple outline removal unit 202 detects an outline and applies a lowpass filter to it. However, the detection method and filter kernel are not limited to the foregoing examples.
- a difference image is generated for each frame, but the present invention is not limited to this. A difference image may be generated periodically or for only an arbitrary frame such as a frame after a scene change.
- a main stream, difference stream, and photographing information are associated by adding the identification code of the main stream to the difference stream and photographing information.
- the association method is not limited to this.
- streams and information can be associated with each other by arranging them in a predetermined order and combining them as a set of streams and information for the same image.
- a decoding apparatus having the arrangement shown in Fig. 3 decodes data generated by an encoding apparatus having the arrangement shown in Fig. 2. An image almost free from jerkiness and an image almost free from a motion blur can be output by switching them in accordance with an external signal.
- a module which executes motion blur removal processing having a relatively heavy processing load is arranged in the decoding apparatus. This arrangement can minimize an increase in power consumption of an encoding apparatus incorporated in a camera which is often driven by a battery.
- Fig. 10 is a block diagram exemplifying the hardware configuration of a computer applicable to an image processing apparatus (including encoding and decoding apparatuses) in each embodiment described above.
- a CPU 1001 controls the whole computer using computer programs and data stored in a RAM 1002 and ROM 1003. Also, the CPU 1001 executes the processes described above as being performed by an apparatus to which the computer is applied.
- the RAM 1002 has an area for temporarily storing computer programs and data loaded from an external storage device 1006, data externally acquired via an I/F (interface) 1007, and the like.
- the RAM 1002 further has a work area used when the CPU 1001 executes various processes. That is, the RAM 1002 can properly provide a variety of areas.
- the ROM 1003 stores a boot program, setting data of the computer, and the like.
- An operation unit 1004 includes a keyboard and mouse. By manipulating the operation unit 1004, the operator of the computer can input various instructions to the CPU 1001. For example, the operator may input the above-mentioned instruction via the operation unit 1004.
- a display unit 1005 is formed from a CRT, liquid crystal display, or the like and can display the result of processing by the CPU 1001 as an image or text.
- the external storage device 1006 is a large-capacity information storage device typified by a hard disk drive.
- the external storage device 1006 saves an OS (Operating System), and computer programs and data for causing the CPU 1001 to achieve the functions of the respective units shown in Figs. 1, 2, and 3.
- the external storage device 1006 may also save, e.g., movie data and photographing information to be processed.
- Computer programs and data saved in the external storage device 1006 are appropriately loaded to the RAM 1002 under the control of the CPU 1001 and processed by the CPU 1001.
- the I/F 1007 communicates data with an external device. For example, when the computer is applied to the encoding apparatus, the I/F 1007 is used to communicate data with the decoding apparatus.
- a bus 1008 connects these units.
- aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s).
- the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (e.g., computer-readable medium) .
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
- Picture Signal Circuits (AREA)
- Television Signal Processing For Recording (AREA)
- Image Processing (AREA)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/143,582 US8867903B2 (en) | 2009-03-06 | 2010-01-26 | Image processing apparatus and image processing method |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2009054064A JP5322704B2 (ja) | 2009-03-06 | 2009-03-06 | Image processing apparatus and image processing method |
| JP2009-054064 | 2009-03-06 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2010100985A1 (en) | 2010-09-10 |
Family
ID=42709548
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2010/051310 Ceased WO2010100985A1 (en) | 2009-03-06 | 2010-01-26 | Image processing apparatus and image processing method |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US8867903B2 (en) |
| JP (1) | JP5322704B2 (ja) |
| WO (1) | WO2010100985A1 (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20160039497A (ko) * | 2014-10-01 | 2016-04-11 | Samsung Electronics Co., Ltd. | Image processing apparatus, display apparatus and image processing method thereof |
| CN116055876A (zh) * | 2021-10-27 | 2023-05-02 | Beijing Zitiao Network Technology Co., Ltd. | Video processing method and apparatus, electronic device, and storage medium |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2003006648A (ja) * | 2001-06-26 | 2003-01-10 | Sony Corp | Image processing apparatus and method, recording medium, and program |
| JP2004126591A (ja) * | 2002-10-02 | 2004-04-22 | Lg Electron Inc | Method and apparatus for driving a plasma display panel |
| JP2005192190A (ja) * | 2003-12-01 | 2005-07-14 | Pioneer Plasma Display Corp | Moving-image false contour reduction method, moving-image false contour reduction circuit, display device, and program |
| JP2006259689A (ja) * | 2004-12-02 | 2006-09-28 | Seiko Epson Corp | Image display method, image display device, and projector |
| JP2007274299A (ja) * | 2006-03-31 | 2007-10-18 | Sony Corp | Image processing apparatus, image processing method, and computer program |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2004312495A (ja) * | 2003-04-09 | 2004-11-04 | Fuji Photo Film Co Ltd | Image processing program and image processing apparatus |
| US7978164B2 (en) * | 2005-03-30 | 2011-07-12 | Sharp Kabushiki Kaisha | Liquid crystal display device |
| JP4678603B2 (ja) * | 2007-04-20 | 2011-04-27 | Fujifilm Corporation | Imaging apparatus and imaging method |
| JP4139430B1 (ja) * | 2007-04-27 | 2008-08-27 | Sharp Kabushiki Kaisha | Image processing apparatus and method, image display apparatus and method |
- 2009-03-06 JP JP2009054064A patent/JP5322704B2/ja not_active Expired - Fee Related
- 2010-01-26 US US13/143,582 patent/US8867903B2/en not_active Expired - Fee Related
- 2010-01-26 WO PCT/JP2010/051310 patent/WO2010100985A1/en not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| US8867903B2 (en) | 2014-10-21 |
| JP2010212800A (ja) | 2010-09-24 |
| US20110274412A1 (en) | 2011-11-10 |
| JP5322704B2 (ja) | 2013-10-23 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10134117B2 (en) | Method and apparatus for viewing images | |
| US8265167B2 (en) | Application specific video format | |
| CN112565603B (zh) | 图像处理方法、装置及电子设备 | |
| WO2007124360A3 (en) | Image stabilization method | |
| KR20150145725A (ko) | Ldr 비디오 시퀀스의 동적 범위 확장을 위한 방법 및 장치 | |
| US10616502B2 (en) | Camera preview | |
| JP2009500931A (ja) | デジタル画像処理装置および処理方法ならびにコンピュータプログラムプロダクト | |
| US20050169537A1 (en) | System and method for image background removal in mobile multi-media communications | |
| JP2000187478A (ja) | 画像処理装置及び画像処理方法 | |
| US8867903B2 (en) | Image processing apparatus and image processing method | |
| CN111510629A (zh) | 数据显示方法、图像处理器、拍摄装置和电子设备 | |
| CN101563914A (zh) | 具有变焦功能的运动画面拍摄装置、图像处理和显示方法及程序 | |
| US20230206449A1 (en) | Computer Software Module Arrangement, a Circuitry Arrangement, an Arrangement and a Method for Improved Image Processing | |
| JP2007134788A (ja) | 画像処理装置及びプログラム | |
| JP2004179886A (ja) | 画像データの平滑化処理装置、平滑化処理方法及び平滑化処理プログラム | |
| US8803949B2 (en) | Reproducing apparatus and reproducing method | |
| JP3594355B2 (ja) | 映像信号蓄積伝送方法および装置 | |
| CN117764834A (zh) | 一种图像复原方法、装置和电子设备 | |
| CN120547433A (zh) | 图像处理方法、装置、电子设备及可读存储介质 | |
| JPH10301556A (ja) | 画像表示制御装置および方法 | |
| CN115643407A (zh) | 视频处理方法及其相关设备 | |
| CN119211708A (zh) | 图像处理电路、方法、电子设备及芯片 | |
| KR20110070763A (ko) | Gop 구조 추출 방법 | |
| JP2007013398A (ja) | ポストフィルタ、ポストフィルタリングプログラムおよび電子情報機器 | |
| JP2009071679A (ja) | 画像処理装置 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10748582 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 13143582 Country of ref document: US |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 10748582 Country of ref document: EP Kind code of ref document: A1 |