WO2012008117A1 - Image encoding device and camera - Google Patents

Image encoding device and camera

Info

Publication number
WO2012008117A1
WO2012008117A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
image processing
unit
encoding device
processing unit
Prior art date
Application number
PCT/JP2011/003835
Other languages
French (fr)
Japanese (ja)
Inventor
義昭 三又
賢二 岩橋
友和 内田
裕規 吉川
Original Assignee
Panasonic Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corporation
Publication of WO2012008117A1 publication Critical patent/WO2012008117A1/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/10: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N 19/102: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N 19/132: Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
    • H04N 19/134: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N 19/146: Data rate or code amount at the encoder output
    • H04N 19/157: Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
    • H04N 19/159: Prediction type, e.g. intra-frame, inter-frame or bidirectional frame prediction
    • H04N 19/50: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N 19/503: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N 19/51: Motion estimation or motion compensation
    • H04N 19/587: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal sub-sampling or interpolation, e.g. decimation or subsequent interpolation of pictures in a video sequence
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/743: Bracketing, i.e. taking a series of images with varying exposure conditions

Definitions

  • the present invention relates to an image encoding device that encodes input moving image data indicating an input image obtained by imaging, and a camera including the image encoding device.
  • Patent Document 1 and Patent Document 2 disclose cameras that obtain encoded data by encoding moving image data indicating a captured image using the MPEG (Moving Picture Experts Group) system.
  • The camera disclosed in Patent Document 1 encodes a captured image as an I picture during zooming, in order to prevent deterioration in image quality due to a decrease in inter-frame correlation.
  • The camera disclosed in Patent Document 2 lowers the zoom speed when the quantization coefficient, which is set based on the bit rate of the obtained encoded data, is large, and raises the zoom speed when the quantization coefficient is small, thereby preventing deterioration of image quality.
  • An object of the present invention is to suppress the code amount of the encoded data and to prevent deterioration of image quality.
  • One aspect of the present invention is an image encoding device that performs encoding using inter-frame prediction on input moving image data indicating an input image obtained by imaging of a camera when the camera executes one operation among zoom, pan, and tilt.
  • The image encoding device includes an image processing unit that executes, on the input image obtained when the operation is executed at a predetermined degree, one of an insertion process that inserts an estimated image, corresponding to the case where the operation is executed at a degree lower than the predetermined degree, between the image immediately before the operation and the image immediately after the operation, and a replacement process that replaces the image immediately before the operation or the image immediately after the operation with the estimated image, and an image encoding unit that encodes moving image data indicating the moving image after the image processing by the image processing unit and outputs encoded data.
  • When the operation is zoom, the degree is a magnification; when the operation is pan or tilt, the degree is a moving distance.
  • When one operation among zoom, pan, and tilt is executed and the image processing unit executes the insertion process, the difference between the image immediately before the operation and the estimated image, and the difference between the estimated image and the image immediately after the operation, are smaller than the difference between the image immediately before the operation and the image immediately after the operation.
  • Even when the image processing unit executes the replacement process, the difference between the image immediately before the operation and the estimated image, or the difference between the estimated image and the image immediately after the operation, is smaller than the difference between the image immediately before the operation and the image immediately after the operation. Therefore, deterioration in image quality due to a decrease in correlation between images can be prevented.
  • In addition, since the code amount of the encoded data can be suppressed, deterioration in image quality due to insufficient bandwidth can be prevented.
  • According to the present invention, it is possible to prevent deterioration in image quality due to a decrease in correlation between images. In addition, since the code amount of the encoded data can be suppressed, deterioration in image quality due to insufficient bandwidth can be prevented.
  • FIG. 1 is a block diagram showing a configuration of a camera according to Embodiment 1 of the present invention.
  • FIG. 2 is a flowchart showing the operation of the image coding apparatus according to the first embodiment of the present invention.
  • FIG. 3 is an explanatory diagram illustrating image processing performed by the image processing unit according to the first embodiment of the invention.
  • FIG. 4 is a block diagram showing a configuration of a camera according to Embodiment 2 of the present invention.
  • FIG. 5 is an explanatory diagram illustrating a threshold table according to the second embodiment of the present invention.
  • FIG. 6 is a flowchart showing the operation of the image coding apparatus according to the second embodiment of the present invention.
  • FIG. 7 is a graph illustrating the relationship between the zoom magnification and the amount of change in the spatial frequency component.
  • FIG. 8 is a block diagram illustrating a configuration of a camera according to Embodiment 3 of the present invention.
  • FIG. 9 is a flowchart showing the operation of the image coding apparatus according to the third embodiment of the present invention.
  • The network camera 100 includes an image sensor 101, an image sensor drive unit 102, a moving image data generation unit 103, an image encoding device 104, an external memory 105, and a transmission device 106.
  • the network camera 100 can perform zooming, panning, and tilting operations.
  • Part or all of the image sensor 101, the image sensor drive unit 102, the moving image data generation unit 103, the image encoding device 104, the external memory 105, and the transmission device 106 are implemented as a single-chip LSI (Large Scale Integration).
  • The image sensor 101 has a light receiving surface on which light from a subject forms an image, performs photoelectric conversion that converts the amount of light received on the light receiving surface (the light reception amount) into an analog electric signal, and outputs the analog electric signal.
  • The image sensor drive unit 102 outputs a drive signal indicating the timing at which the image sensor 101 outputs the analog electric signal, so that the analog electric signal is output at fixed intervals.
  • The moving image data generation unit 103 performs processing such as amplification and noise removal on the analog electric signal output by the image sensor 101, converts it into a digital signal, and outputs the result as input moving image data indicating an input image.
  • the image encoding device 104 includes an image processing unit 107, an image encoding unit 108, a component information acquisition unit 109, a determination unit 110, and a control unit 111.
  • The image processing unit 107 receives the input moving image data output from the moving image data generation unit 103 and, when enlargement (zoom-in) at a predetermined magnification is executed, performs image processing on the input image indicated by the input moving image data.
  • The image processing executed at this time is an insertion process: an image obtained by enlarging the image immediately before enlargement at a magnification lower than the predetermined magnification is created as an estimated image and inserted between the image immediately before enlargement and the image immediately after enlargement.
  • The image processing unit 107 also performs image processing on the input image indicated by the input moving image data when reduction (zoom-out) at a predetermined magnification is executed.
  • The image processing performed at this time is an insertion process: an image obtained by reducing the image immediately before reduction at a magnification higher than the predetermined magnification is created as an estimated image and inserted between the image immediately before reduction and the image immediately after reduction.
  • the magnification of the estimated image is a value indicated by a parameter output by the control unit 111 described later.
  • the image encoding unit 108 performs encoding processing using inter-frame prediction on the moving image data indicating the moving image after the image processing by the image processing unit 107, and outputs the encoded data.
  • The encoding process includes a motion search process that generates a motion vector by selecting the closest macroblock from the immediately preceding encoded frame, a DCT transform, quantization, and other processes, and compresses the moving image data.
  • the encoded data is stored in an external memory (recording medium) 105.
  • The component information acquisition unit 109 acquires component information such as a luminance component and a spatial frequency component based on the input moving image data output from the moving image data generation unit 103, and also acquires the zoom magnification input by the user via a user interface (not shown) (hereinafter referred to as the "acquisition magnification").
  • Autofocus is used to acquire the spatial frequency component, and automatic exposure (AE) control is used to acquire the luminance component.
  • The determination unit 110 determines, based on the acquisition magnification acquired by the component information acquisition unit 109, whether to cause the image processing unit 107 to execute the image processing. Specifically, when an enlargement operation (zoom-in) is executed, the determination unit 110 compares the acquisition magnification with a predetermined threshold. Based on the comparison result, when the acquisition magnification is larger than the predetermined threshold, that is, when the enlargement operation to be executed is larger than the reference corresponding to the predetermined threshold, the determination unit 110 determines that the image processing unit 107 is to execute the image processing.
  • When the acquisition magnification is equal to or smaller than the predetermined threshold, that is, when the enlargement operation to be executed is not larger than the reference corresponding to the predetermined threshold, the determination unit 110 determines that the image processing unit 107 is not to execute the image processing.
  • When a reduction operation (zoom-out) is executed, the determination unit 110 compares the acquisition magnification with a predetermined threshold. Based on the comparison result, when the acquisition magnification is smaller than the predetermined threshold, that is, when the reduction operation to be executed is larger than the reference corresponding to the predetermined threshold, the determination unit 110 determines that the image processing unit 107 is to execute the image processing.
  • When the acquisition magnification is equal to or larger than the predetermined threshold, that is, when the reduction operation to be executed is not larger than the reference corresponding to the predetermined threshold, the determination unit 110 determines that the image processing unit 107 is not to execute the image processing.
  • When the determination unit 110 determines that the image processing unit 107 is to execute the image processing, the control unit 111 performs first control that causes the image processing unit 107 to execute the image processing; when the determination unit 110 determines that the image processing unit 107 is not to execute the image processing, the control unit 111 performs second control that causes the image processing unit 107 not to execute the image processing. In addition, when performing the first control, the control unit 111 outputs a parameter indicating the reduction magnification or enlargement magnification of the estimated image.
  • the external memory 105 is used as a work area by the image encoding device 104.
  • the transmission device 106 transmits the encoded data output from the image encoding device 104 to the network.
  • FIG. 2 is a flowchart showing the operation of the image encoding device 104 when zooming in.
  • In (S2001), the component information acquisition unit 109 acquires component information such as a luminance component and a spatial frequency component based on the input moving image data output from the moving image data generation unit 103, and acquires the zoom magnification input by the user via a user interface (not shown). The process then proceeds to (S2002).
  • In (S2002), the determination unit 110 determines whether the zoom magnification acquired by the component information acquisition unit 109 in (S2001) is equal to or greater than a predetermined threshold. If the zoom magnification is equal to or greater than the predetermined threshold, the process proceeds to (S2003); if it is less than the predetermined threshold, the process proceeds to (S2006).
  • In (S2003), the control unit 111 calculates a parameter indicating the enlargement magnification of the estimated image based on the zoom magnification acquired by the component information acquisition unit 109 in (S2001).
  • In (S2004), the control unit 111 performs the first control that causes the image processing unit 107 to execute the image processing.
  • The image processing unit 107 thereby creates, as the estimated image, an image obtained by enlarging the image immediately before zooming by the enlargement magnification indicated by the parameter calculated in (S2003), and inserts it between the image immediately before enlargement and the image immediately after enlargement indicated by the input moving image data.
  • In (S2005), the image encoding unit 108 performs the encoding process on the moving image data indicating the moving image after the processing in (S2004), and outputs the encoded data.
  • In (S2006), the control unit 111 performs the second control that causes the image processing unit 107 not to execute the image processing; therefore, the image processing unit 107 does not execute the image processing.
  • In (S2007), the image encoding unit 108 performs the encoding process on the input moving image data and outputs the encoded data.
  • FIG. 3 illustrates image processing performed by the image processing unit 107.
  • 301a and 301d each indicate an image of the entire imaging region, and 302 indicates a subject.
  • The portion surrounded by a double line illustrates how to obtain a magnification at which the enlarged image of the subject 302 does not exceed the search range 303.
  • For example, when the zoom magnification is 4x, the determination unit 110 determines that the magnification of 4x is equal to or greater than the predetermined threshold, and the control unit 111 outputs a parameter indicating 2x, which is smaller than 4x. The image processing unit 107 then inserts the image 301b, obtained by enlarging the image 301a by the magnification indicated by the parameter output from the control unit 111, that is, 2x, between the images 301a and 301d as an estimated image.
  • The estimated image may be generated by enlarging only the subject portion of the image 301a or by enlarging the entire image 301a. The same applies to reduction.
  • As a result, the code amount of the encoded data can be suppressed.
  • Further, an image 301c (an image obtained by enlarging the image 301a by 2 times) may also be inserted between the image 301a and the image 301d, or only the image 301c may be inserted.
  • By setting the threshold value of the determination unit 110 so that the search range for obtaining a motion vector does not exceed the predetermined search range 303, the image input to the image encoding unit 108 after the image 301a is controlled by the image processing unit 107, and the code amount of the encoded data can be suppressed within a certain range.
  • In this way, the difference between the image immediately before the operation and the estimated image, and the difference between the estimated image and the image immediately after the operation, are smaller than the difference between the image immediately before the operation and the image immediately after the operation. Therefore, deterioration of image quality due to a decrease in correlation between images is prevented, and the code amount (bit rate) of the encoded data can be suppressed. As a result, image quality deterioration due to insufficient bandwidth of the transmission path is prevented, and a stable image can be supplied. Further, the code amount can be assigned efficiently to images that need a large code amount at times other than during zooming (camera operation).
  • Since the determination unit 110 determines, based on the acquisition magnification acquired by the component information acquisition unit 109, whether to cause the image processing unit 107 to execute the image processing, the image processing unit 107 can start the image processing before the decrease in correlation between images caused by zooming affects the encoded data. Accordingly, deterioration in image quality can be prevented.
  • The determination unit 110 may perform the determination based on the picture type of the input image immediately before zooming in addition to the acquisition magnification. For example, when the picture type of the input image immediately before zooming is an I picture (intra-coded picture), it may be determined that the image processing unit 107 is not to execute the image processing even if the zoom to be executed is larger than the reference corresponding to the predetermined threshold. In addition, when the picture type of the input image immediately before zooming is a P picture (predictive-coded picture), the threshold used for the determination and the parameter output by the control unit 111 may be set so that the allowable inter-frame difference is smaller than when the picture type is an I picture.
  • The determination unit 110 receives the spatial frequency component acquired by the component information acquisition unit 109 and determines, based on the amount of change in the spatial frequency component between the image immediately before zooming and the image immediately after zooming instead of the zoom magnification, whether to cause the image processing unit 107 to execute the image processing. Specifically, when the amount of change in the spatial frequency component between the images before and after zooming is equal to or greater than a predetermined threshold, it is determined that the image processing unit 107 is to execute the image processing; when the amount of change is less than the predetermined threshold, it is determined that the image processing unit 107 is not to execute the image processing.
  • the threshold to be compared with the change amount of the spatial frequency component in the determination of the determination unit 110 can be set based on the relationship among the interframe difference, the zoom magnification, and the change amount of the spatial frequency component. For example, the threshold is set to the amount of change in the spatial frequency component when zooming in at the maximum magnification among the enlargement magnifications in which the search range for obtaining the motion vector does not exceed a predetermined range.
  • The control unit 111 may hold a correspondence table indicating the correspondence between the amount of change in the spatial frequency component and the zoom magnification, and may determine, based on the correspondence table, a parameter indicating the enlargement or reduction magnification of the estimated image, that is, by what factor the image immediately before zooming is to be multiplied to create the estimated image.
  • The determination unit 110 receives the luminance component acquired by the component information acquisition unit 109 and determines, based on the amount of change in the luminance component between the image immediately before zooming and the image immediately after zooming instead of the zoom magnification, whether to cause the image processing unit 107 to execute the image processing. Specifically, when the amount of change in the luminance component between the images before and after zooming is equal to or greater than a predetermined threshold, it is determined that the image processing unit 107 is to execute the image processing; when the amount of change is less than the predetermined threshold, it is determined that the image processing unit 107 is not to execute the image processing.
  • the threshold value to be compared with the change amount of the luminance component in the determination by the determination unit 110 can be set based on the relationship among the interframe difference, the zoom magnification, and the change amount of the luminance component.
  • the threshold value may be set to the amount of change in the luminance component when zooming in at the maximum magnification among the enlargement magnifications in which the search range for obtaining the motion vector does not exceed a predetermined range.
  • The control unit 111 may hold a correspondence table indicating the correspondence between the amount of change in the luminance component and the zoom magnification, and may determine, based on the correspondence table, a parameter indicating the enlargement or reduction magnification of the estimated image, that is, by what factor the image immediately before zooming is to be multiplied to create the estimated image.
  • <<Modification 3 of Embodiment 1>> In the network camera 100 according to Modification 3 of Embodiment 1 of the present invention, instead of the insertion process in which the image processing unit 107 inserts an estimated image between the image immediately before zooming and the image immediately after zooming at the time of zooming, a replacement process that replaces the image immediately before zooming with the estimated image is performed.
  • Alternatively, a replacement process that replaces the image immediately after zooming with the estimated image may be performed.
  • In the network camera 100 according to Modification 4 of Embodiment 1, instead of the function of executing the image processing on the input image at the time of zooming, the image processing unit 107 is provided with a function of executing the following image processing on the input image at the time of panning or tilting.
  • The image processing performed at the time of panning or tilting is an insertion process: an estimated image is created by shifting the image immediately before the pan or tilt by a moving distance shorter than the moving distance of the pan or tilt, and is inserted between the image immediately before execution of the pan or tilt and the image immediately after execution of the pan or tilt. For example, when the amount of movement from a predetermined position of the image immediately before execution of the pan or tilt is 1 and the amount of movement from the predetermined position of the image immediately after execution is 4, an image whose amount of movement from the predetermined position is 2 is inserted as an estimated image between the images before and after execution of the pan or tilt.
  • The determination unit 110 determines that the image processing unit 107 is to execute the image processing when the pan or tilt movement distance is equal to or greater than a predetermined threshold, and determines that the image processing unit 107 is not to execute the image processing when the movement distance is less than the predetermined threshold.
  • By setting the threshold value of the determination unit 110 so that the search range for obtaining a motion vector does not exceed a predetermined search range, the image input to the image encoding unit 108 is controlled by the image processing unit 107, and the code amount of the encoded data can be suppressed within a certain range. Also, the code amount of the encoded data can be stabilized by executing the same steps again on the image inserted as the estimated image.
  • The determination unit 110 may receive the spatial frequency component acquired by the component information acquisition unit 109 and, instead of the pan or tilt movement distance, determine whether to cause the image processing unit 107 to execute the image processing of the fourth modification based on the amount of change in the spatial frequency component between the image immediately before the pan or tilt and the image immediately after the pan or tilt. Specifically, the determination unit 110 may determine that the image processing unit 107 is to execute the image processing of the fourth modification when the amount of change in the spatial frequency component between the images before and after the pan or tilt is equal to or greater than a predetermined threshold, and may determine that the image processing unit 107 is not to execute the image processing of the fourth modification when the amount of change is less than the predetermined threshold.
  • Similarly, the determination unit 110 may determine, instead of based on the pan or tilt movement distance, whether to cause the image processing unit 107 to execute the image processing of the fourth modification based on the amount of change in the luminance component between the image immediately before the pan or tilt and the image immediately after the pan or tilt. Specifically, the determination unit 110 may determine that the image processing unit 107 is to execute the image processing of the fourth modification when the amount of change in the luminance component between the images before and after the pan or tilt is equal to or greater than a predetermined threshold, and may determine that the image processing unit 107 is not to execute the image processing of the fourth modification when the amount of change is less than the predetermined threshold.
  • Further, instead of the insertion process, a replacement process that replaces the image immediately before execution of the pan or tilt with the estimated image, or a replacement process that replaces the image immediately after execution of the pan or tilt with the estimated image, may be performed.
  • the camera 400 according to the second embodiment of the present invention includes an image encoding device 404 instead of the image encoding device 104 according to the first embodiment.
  • the image encoding device 404 includes a table storage unit 401, and includes a determination unit 410 and a control unit 411 instead of the determination unit 110 and the control unit 111.
  • The table storage unit 401 stores a threshold table indicating the correspondence between the amount of change in the spatial frequency component of the image before and after the operation (zoom), acquired by the component information acquisition unit 109, and the control method to be performed on the image processing unit 107.
  • The determination unit 410 specifies, from the threshold table stored in the table storage unit 401, the control method corresponding to the amount of change in the spatial frequency component of the image before and after the operation, which is acquired by the component information acquisition unit 109 based on the input moving image data. The control unit 411 controls the image processing unit 107 by the control method specified by the determination unit 410.
  • In the threshold table of FIG. 5, when the amount of change in the spatial frequency component of the image before and after the operation is less than a, the control unit 411 performs the second control that does not cause the image processing unit 107 to execute the image processing. When the amount of change is equal to or greater than a and less than b, the control unit 411 performs the first control that causes the image processing unit 107 to execute the image processing and outputs a parameter indicating 3 as the magnification of the estimated image. When the amount of change is equal to or greater than b and less than c, the control unit 411 performs the first control and outputs a parameter indicating 5 as the magnification of the estimated image. When the amount of change is equal to or greater than c and less than d, the control unit 411 performs the first control and outputs a parameter indicating 7 as the magnification of the estimated image. When the amount of change is equal to or greater than d, the control unit 411 performs the first control and outputs a parameter indicating 9 as the magnification of the estimated image (a sketch of this lookup appears after this list).
  • the threshold value table of FIG. 5 shows the correspondence between the amount of change in the spatial frequency component of the image before and after the operation and one of the first and second controls.
  • FIG. 6 is a flowchart showing the operation of the image encoding device 404 during zooming. In FIG. 6, the same processes as those in FIG. 2 are denoted by the same reference numerals, and their description is omitted.
  • the determination unit 410 determines whether there is a change in the component information acquired by the component information acquisition unit 109 in (S2001). If there is a change, the process proceeds to (S6003), whereas if there is no change, the process proceeds to (S2007).
  • the determination unit 410 specifies a control method corresponding to the spatial frequency component acquired by the component information acquisition unit 109 based on the input moving image data from the threshold value table stored in the table storage unit 401.
  • the control unit 411 controls the image processing unit 107 by the control method specified in (S6003).
  • the control unit 411 outputs a parameter indicating the magnification of the estimated image.
  • the image processing unit 107 operates according to control by the control unit 411.
  • When executing the image processing, the image processing unit 107 generates, as the estimated image, an image obtained by enlarging the image immediately before zooming by the magnification indicated by the parameter output by the control unit 411.
  • The image encoding unit 108 encodes the moving image data indicating the moving image after the image processing in (S6004) and outputs encoded data.
  • the image encoding unit 108 performs an encoding process on the input moving image data and outputs encoded data.
  • In FIG. 7, (i) is a graph showing the relationship between the zoom magnification (number of steps) and the amount of change in the spatial frequency component for subject 1, (ii) is a graph showing the relationship between the zoom magnification and the amount of change in the spatial frequency component for subject 2, and (iii) is a graph showing the relationship between the zoom magnification and the amount of change in the spatial frequency component for subject 3.
  • The table storage unit 401 may store a plurality of threshold tables, and the determination unit 410 may select the threshold table to be used for specifying the control method according to the component information (image input conditions) acquired by the component information acquisition unit 109.
  • In the above description, the threshold table indicating the correspondence between the amount of change in the spatial frequency component and the control method is used to specify the control method; however, a threshold table indicating the correspondence between the amount of change in the luminance component and the control method may be used instead.
  • the optimal image processing method is expected to exist in the vicinity of the control value determined last time. Therefore, the input image processing result such as the code amount of encoded data, frequency component information, and picture type may be reflected in the threshold value table every time it is obtained. That is, the threshold table may be updated based on the obtained encoded data.
  • the camera 800 according to the third embodiment of the present invention includes an image encoding device 804 instead of the image encoding device 404 according to the second embodiment.
  • the image encoding device 804 includes a threshold learning unit 801 in addition to the configuration of the image encoding device 404.
  • The threshold learning unit 801 checks the relationship between the control method (image processing conditions) and the code amount of the encoded data, and reflects the relationship in the threshold table (a sketch of this update appears after this list). Specifically, when the code amount of the encoded data is equal to or less than a predetermined threshold, the amount of change in the spatial frequency component and the zoom magnification (the magnification of the estimated image) at that time are associated with each other and written to the threshold table. If the amount of change in the spatial frequency component at that time is already stored in the threshold table in association with a predetermined zoom magnification, that zoom magnification is overwritten with the current zoom magnification. On the other hand, if the amount of change in the spatial frequency component at that time is not stored in the threshold table, the amount of change in the spatial frequency component and the zoom magnification at that time are newly associated with each other and written to the threshold table.
  • FIG. 9 is a flowchart showing the operation of the image encoding device 804 during zooming. In FIG. 9, the same processes as those in FIG. 6 are denoted by the same reference numerals, and their description is omitted.
  • the determination unit 410 confirms the code amount of the encoded data in (S9001). At this time, if the code amount is equal to or smaller than the predetermined threshold, the process proceeds to (S9005), and if the code amount exceeds the predetermined threshold, the process proceeds to (S9002).
  • In (S9002), the control unit 411 requests the image processing unit 107 to perform the image processing, and the image processing unit 107 generates the input data for the image encoding unit 108. Specifically, the determination unit 410 specifies, from the threshold table stored in the table storage unit 401, the control method corresponding to the spatial frequency component acquired by the component information acquisition unit 109 based on the input moving image data, and the control unit 411 controls the image processing unit 107 by the specified control method. The process then proceeds to (S9003).
  • the image encoding unit 108 performs an encoding process on the input data generated in (S9002). Then, the process proceeds to (S9004).
  • the determination unit 410 compares the code amount of the encoded data generated in (S9003) with a predetermined threshold value. If the code amount is equal to or less than the predetermined threshold, the process proceeds to (S9006). If the code amount exceeds the predetermined threshold, the process proceeds to (S9007).
  • In (S9005), the amount of change in the spatial frequency component acquired in (S2001) is associated with the second control as the control method and written to the threshold table. If the amount of change in the spatial frequency component at that time is already stored in the threshold table in association with a predetermined control method, the second control is overwritten as the control method. On the other hand, if the amount of change in the spatial frequency component at that time is not stored in the threshold table, the amount of change and the second control are newly associated with each other and written to the threshold table.
  • In (S9006), the threshold learning unit 801 associates the amount of change in the spatial frequency component acquired in (S2001) with the control method (the number of steps, that is, the magnification of the estimated image) used in the immediately preceding image processing (S9002), and writes them to the threshold table. If the amount of change in the spatial frequency component at that time is already stored in the threshold table in association with a predetermined control method, only the control method is overwritten. On the other hand, if the amount of change in the spatial frequency component at that time is not stored in the threshold table, the amount of change and the control method of the immediately preceding image processing are newly associated with each other and written to the threshold table.
  • In (S9007), the control unit 411 sets the control method (the magnification of the estimated image) for the next image processing (S9002) so that the code amount of the encoded data obtained by the next encoding (S9003) becomes smaller than the code amount of the encoded data obtained by the previous encoding.
  • Since the processing is performed so as to optimize the code amount based on the encoding result, an encoding process more suitable for the scene can be performed.
  • the processes of (S9002) to (S9004) can be performed a plurality of times, but may be performed only once to simplify the process.
  • Embodiments 2 and 3 can also be applied to Modifications 3 and 4 of Embodiment 1 described above.
  • the image encoding device and the camera according to the present invention are useful as an image encoding device that encodes input moving image data indicating an input image obtained by imaging, and a camera including the image encoding device.
  • For example, they can be used in a digital camera or a network camera system.
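
The bullets above describe the threshold table of FIG. 5 (Embodiment 2) as an ordered set of ranges. The Python sketch below shows one way such a lookup could be expressed; the boundary values a, b, c, d are left symbolic in the description, so the numbers used here are placeholders, and the function name is hypothetical.

```python
# Placeholder boundaries for a < b < c < d; the description leaves them symbolic.
A, B, C, D = 10.0, 20.0, 30.0, 40.0

def lookup_control(spatial_freq_change: float):
    """Return the magnification of the estimated image (first control),
    or None to indicate the second control (no image processing)."""
    if spatial_freq_change < A:
        return None      # second control
    elif spatial_freq_change < B:
        return 3         # first control, estimated-image magnification 3
    elif spatial_freq_change < C:
        return 5
    elif spatial_freq_change < D:
        return 7
    else:
        return 9
```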
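
The learning step of Embodiment 3 amounts to writing or overwriting one table entry whenever the code amount of the encoded data stays at or below its threshold. The dictionary-based sketch below is an assumption about the data layout; the description only states that the amount of change and the control method (or zoom magnification) are associated with each other and written to the threshold table.

```python
def update_threshold_table(table: dict, spatial_freq_change: float,
                           control, code_amount: int, code_threshold: int) -> None:
    """Reflect the latest encoding result in the threshold table.

    'control' is the control method just used: None for the second control, or
    the magnification of the estimated image for the first control. In practice
    the change amount would likely be quantised into bins before use as a key."""
    if code_amount > code_threshold:
        return  # code amount too large: nothing is written in this step
    # Overwrite the existing entry for this change amount, or create a new one.
    table[spatial_freq_change] = control
```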

Abstract

In the present disclosure, an image processing unit (107) executes, on the input image obtained when one operation among zoom, pan, and tilt is executed at a predetermined degree, one of the following types of image processing: an insertion process that inserts an estimated image, corresponding to the operation being executed at a degree lower than the predetermined degree, between the image immediately before the operation and the image immediately after the operation; and a replacement process that replaces the image immediately before the operation or the image immediately after the operation with the estimated image. An image encoding unit (108) encodes moving image data indicating the moving image after the image processing by the image processing unit (107) and outputs encoded data.

Description

Image encoding device and camera
 The present invention relates to an image encoding device that encodes input moving image data indicating an input image obtained by imaging, and to a camera including the image encoding device.
 Patent Document 1 and Patent Document 2 disclose cameras that obtain encoded data by encoding moving image data indicating a captured image using the MPEG (Moving Picture Experts Group) system.
 The camera disclosed in Patent Document 1 encodes a captured image as an I picture during zooming, in order to prevent deterioration in image quality due to a decrease in inter-frame correlation.
 The camera disclosed in Patent Document 2 lowers the zoom speed when the quantization coefficient, which is set based on the bit rate of the obtained encoded data, is large, and raises the zoom speed when the quantization coefficient is small, thereby preventing deterioration of image quality.
JP 2004-180345 A; JP 2003-158661 A
 However, in Patent Document 1, when zooming is performed at high speed, the captured image is treated as an I picture, so the code amount of the encoded data increases and the bandwidth of the transmission path for the encoded data may become insufficient. The image quality may also deteriorate because of the insufficient bandwidth.
 In view of the above, an object of the present invention is to suppress the code amount of the encoded data and to prevent deterioration of image quality.
 In order to solve the above problem, one aspect of the present invention is an image encoding device that performs encoding using inter-frame prediction on input moving image data indicating an input image obtained by imaging of a camera when the camera executes one operation among zoom, pan, and tilt. The image encoding device includes: an image processing unit that executes, on the input image obtained when the operation is executed at a predetermined degree, one of an insertion process that inserts an estimated image, corresponding to the case where the operation is executed at a degree lower than the predetermined degree, between the image immediately before the operation and the image immediately after the operation, and a replacement process that replaces the image immediately before the operation or the image immediately after the operation with the estimated image; and an image encoding unit that encodes moving image data indicating the moving image after the image processing by the image processing unit and outputs encoded data. When the operation is zoom, the degree is a magnification; when the operation is pan or tilt, the degree is a moving distance.
 According to this aspect, when one operation among zoom, pan, and tilt is executed and the image processing unit executes the insertion process, the difference between the image immediately before the operation and the estimated image, and the difference between the estimated image and the image immediately after the operation, are smaller than the difference between the image immediately before the operation and the image immediately after the operation. Even when the image processing unit executes the replacement process, the difference between the image immediately before the operation and the estimated image, or the difference between the estimated image and the image immediately after the operation, is smaller than the difference between the image immediately before the operation and the image immediately after the operation. Therefore, deterioration in image quality due to a decrease in correlation between images can be prevented. In addition, since the code amount of the encoded data can be suppressed, deterioration in image quality due to insufficient bandwidth can be prevented.
 According to the present invention, it is possible to prevent deterioration in image quality due to a decrease in correlation between images. In addition, since the code amount of the encoded data can be suppressed, deterioration in image quality due to insufficient bandwidth can be prevented.
 FIG. 1 is a block diagram showing the configuration of a camera according to Embodiment 1 of the present invention.
 FIG. 2 is a flowchart showing the operation of the image encoding device according to Embodiment 1 of the present invention.
 FIG. 3 is an explanatory diagram illustrating image processing performed by the image processing unit according to Embodiment 1 of the present invention.
 FIG. 4 is a block diagram showing the configuration of a camera according to Embodiment 2 of the present invention.
 FIG. 5 is an explanatory diagram illustrating a threshold table according to Embodiment 2 of the present invention.
 FIG. 6 is a flowchart showing the operation of the image encoding device according to Embodiment 2 of the present invention.
 FIG. 7 is a graph illustrating the relationship between the zoom magnification and the amount of change in the spatial frequency component.
 FIG. 8 is a block diagram showing the configuration of a camera according to Embodiment 3 of the present invention.
 FIG. 9 is a flowchart showing the operation of the image encoding device according to Embodiment 3 of the present invention.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings. In each of the following embodiments, components having the same functions as those in other embodiments are denoted by the same reference numerals, and their description is omitted.
 <<Embodiment 1>>
 As shown in FIG. 1, the network camera 100 according to Embodiment 1 of the present invention includes an image sensor 101, an image sensor drive unit 102, a moving image data generation unit 103, an image encoding device 104, an external memory 105, and a transmission device 106. The network camera 100 can perform zoom, pan, and tilt operations. Part or all of the image sensor 101, the image sensor drive unit 102, the moving image data generation unit 103, the image encoding device 104, the external memory 105, and the transmission device 106 are implemented as a single-chip LSI (Large Scale Integration).
 The image sensor 101 has a light receiving surface on which light from a subject forms an image, performs photoelectric conversion that converts the amount of light received on the light receiving surface (the light reception amount) into an analog electric signal, and outputs the analog electric signal.
 The image sensor drive unit 102 outputs a drive signal indicating the timing at which the image sensor 101 outputs the analog electric signal, so that the analog electric signal is output at fixed intervals.
 The moving image data generation unit 103 performs processing such as amplification and noise removal on the analog electric signal output by the image sensor 101, converts it into a digital signal, and outputs the result as input moving image data indicating an input image.
 The image encoding device 104 includes an image processing unit 107, an image encoding unit 108, a component information acquisition unit 109, a determination unit 110, and a control unit 111.
 The image processing unit 107 receives the input moving image data output from the moving image data generation unit 103 and, when enlargement (zoom-in) at a predetermined magnification is executed, performs image processing on the input image indicated by the input moving image data. The image processing executed at this time is an insertion process: an image obtained by enlarging the image immediately before enlargement at a magnification lower than the predetermined magnification is created as an estimated image and inserted between the image immediately before enlargement and the image immediately after enlargement. The image processing unit 107 also performs image processing on the input image indicated by the input moving image data when reduction (zoom-out) at a predetermined magnification is executed. The image processing performed at this time is an insertion process: an image obtained by reducing the image immediately before reduction at a magnification higher than the predetermined magnification is created as an estimated image and inserted between the image immediately before reduction and the image immediately after reduction. The magnification of the estimated image is the value indicated by a parameter output by the control unit 111 described later.
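
 The insertion process can be pictured with a short sketch. The following Python fragment is a minimal illustration under stated assumptions, not the patented implementation: frames are assumed to be NumPy arrays, and the helper names (digital_zoom, insert_estimated_frame) are hypothetical. It builds the estimated frame by cropping the center of the pre-zoom frame and scaling it back to the original size, which corresponds to enlarging the image immediately before enlargement at a magnification lower than the requested one.

```python
import numpy as np

def digital_zoom(frame: np.ndarray, magnification: float) -> np.ndarray:
    """Crop the central 1/magnification portion of the frame (magnification >= 1)
    and scale it back to the original size with nearest-neighbour sampling."""
    h, w = frame.shape[:2]
    ch, cw = int(h / magnification), int(w / magnification)
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = frame[y0:y0 + ch, x0:x0 + cw]
    ys = (np.arange(h) * ch / h).astype(int)   # nearest-neighbour row indices
    xs = (np.arange(w) * cw / w).astype(int)   # nearest-neighbour column indices
    return crop[ys][:, xs]

def insert_estimated_frame(pre_zoom: np.ndarray, post_zoom: np.ndarray,
                           estimated_magnification: float) -> list:
    """Return [pre-zoom frame, estimated frame, post-zoom frame], where the
    estimated frame is the pre-zoom frame enlarged at the (lower) magnification
    indicated by the parameter from the control unit."""
    estimated = digital_zoom(pre_zoom, estimated_magnification)
    return [pre_zoom, estimated, post_zoom]
```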
 The image encoding unit 108 performs an encoding process using inter-frame prediction on the moving image data indicating the moving image after the image processing by the image processing unit 107, and outputs encoded data. The encoding process includes a motion search process that generates a motion vector by selecting the closest macroblock from the immediately preceding encoded frame, a DCT transform, quantization, and other processes, and compresses the moving image data. The encoded data is stored in the external memory (recording medium) 105.
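
 As one concrete illustration of the motion search mentioned above, the sketch below performs an exhaustive SAD (sum of absolute differences) search for a single 16x16 macroblock within a bounded search range in the previous frame. This is a generic textbook routine written as an assumption, not the actual encoder of the image encoding unit 108; the block size and search range are placeholder values.

```python
import numpy as np

def motion_search(prev_frame: np.ndarray, cur_frame: np.ndarray,
                  top: int, left: int, block: int = 16,
                  search_range: int = 16) -> tuple:
    """Find the motion vector (dy, dx) for the macroblock at (top, left) in
    cur_frame that minimises the SAD against the previous frame."""
    h, w = prev_frame.shape[:2]
    target = cur_frame[top:top + block, left:left + block].astype(np.int32)
    best, best_sad = (0, 0), None
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + block > h or x + block > w:
                continue  # candidate block falls outside the reference frame
            cand = prev_frame[y:y + block, x:x + block].astype(np.int32)
            sad = int(np.abs(target - cand).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best
```

 The smaller the displacement between consecutive frames, the smaller the motion-compensated residual, which is why keeping the inter-frame difference small by inserting an estimated image keeps the code amount of the encoded data down.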
 The component information acquisition unit 109 acquires component information such as a luminance component and a spatial frequency component based on the input moving image data output from the moving image data generation unit 103, and also acquires the zoom magnification input by the user via a user interface (not shown) (hereinafter referred to as the "acquisition magnification"). Autofocus is used to acquire the spatial frequency component, and automatic exposure (AE) control is used to acquire the luminance component.
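
 When no autofocus or automatic-exposure statistics are available, the component information can be approximated directly from the pixels. The sketch below is an assumption rather than part of the patent: it uses the mean pixel value as the luminance component and the mean absolute difference between neighbouring pixels as a crude proxy for the spatial frequency component.

```python
import numpy as np

def component_info(frame: np.ndarray) -> dict:
    """Return simple proxies for the luminance and spatial frequency components
    of a grayscale frame (pixel values in 0..255)."""
    f = frame.astype(np.float64)
    luminance = f.mean()
    # High-frequency content proxy: mean absolute difference between neighbours.
    spatial_freq = 0.5 * (np.abs(np.diff(f, axis=0)).mean() +
                          np.abs(np.diff(f, axis=1)).mean())
    return {"luminance": luminance, "spatial_frequency": spatial_freq}
```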
 The determination unit 110 determines, based on the acquisition magnification acquired by the component information acquisition unit 109, whether to cause the image processing unit 107 to execute the image processing. Specifically, when an enlargement operation (zoom-in) is executed, the determination unit 110 compares the acquisition magnification with a predetermined threshold. Based on the comparison result, when the acquisition magnification is larger than the predetermined threshold, that is, when the enlargement operation to be executed is larger than the reference corresponding to the predetermined threshold, the determination unit 110 determines that the image processing unit 107 is to execute the image processing. When the acquisition magnification is equal to or smaller than the predetermined threshold, that is, when the enlargement operation to be executed is not larger than the reference, the determination unit 110 determines that the image processing unit 107 is not to execute the image processing. Likewise, when a reduction operation (zoom-out) is executed, the determination unit 110 compares the acquisition magnification with a predetermined threshold. Based on the comparison result, when the acquisition magnification is smaller than the predetermined threshold, that is, when the reduction operation to be executed is larger than the reference corresponding to the predetermined threshold, the determination unit 110 determines that the image processing unit 107 is to execute the image processing. When the acquisition magnification is equal to or larger than the predetermined threshold, that is, when the reduction operation to be executed is not larger than the reference, the determination unit 110 determines that the image processing unit 107 is not to execute the image processing.
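
 Expressed as code, the determination reduces to a pair of threshold comparisons. The function below is only a sketch of that decision rule; the function name and the threshold values are hypothetical placeholders.

```python
def should_run_image_processing(acquisition_magnification: float,
                                zoom_in_threshold: float = 2.0,
                                zoom_out_threshold: float = 0.5) -> bool:
    """Decide whether the image processing unit should run.

    Zoom-in:  process only when the requested magnification exceeds the threshold.
    Zoom-out: process only when the requested magnification is below the threshold
              (i.e. the reduction is larger than the reference)."""
    if acquisition_magnification > 1.0:      # enlargement (zoom-in)
        return acquisition_magnification > zoom_in_threshold
    elif acquisition_magnification < 1.0:    # reduction (zoom-out)
        return acquisition_magnification < zoom_out_threshold
    return False                             # no zoom requested
```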
 When the determination unit 110 determines that the image processing unit 107 should execute the image processing, the control unit 111 performs a first control that causes the image processing unit 107 to execute the image processing; when the determination unit 110 determines that the image processing unit 107 should not execute the image processing, the control unit 111 performs a second control that causes the image processing unit 107 not to execute the image processing. When performing the first control, the control unit 111 also outputs a parameter indicating the reduction or enlargement magnification of the estimated image.
 The external memory 105 is used as a work area by the image encoding device 104.
 The transmission device 106 transmits the encoded data output by the image encoding device 104 to the network.
 FIG. 2 is a flowchart showing the operation of the image encoding device 104 during zoom-in.
 In (S2001), the component information acquisition unit 109 acquires component information such as the luminance component and the spatial frequency component from the input moving image data output by the moving image data generation unit 103, and also acquires the zoom magnification entered by the user via the user interface (not shown). The process then proceeds to (S2002).
 In (S2002), the determination unit 110 determines whether the zoom magnification acquired by the component information acquisition unit 109 in (S2001) is equal to or greater than a predetermined threshold. If it is, the process proceeds to (S2003); if it is less than the threshold, the process proceeds to (S2006).
 In (S2003), the control unit 111 calculates a parameter indicating the enlargement magnification of the estimated image, based on the zoom magnification acquired by the component information acquisition unit 109 in (S2001).
 In (S2004), the control unit 111 performs the first control, which causes the image processing unit 107 to execute the image processing. The image processing unit 107 thereby creates, as the estimated image, an image obtained by enlarging the image immediately before the zoom at the enlargement magnification indicated by the parameter calculated in (S2003), and inserts it between the image immediately before the enlargement and the image immediately after the enlargement indicated by the input moving image data.
 In (S2005), the image encoding unit 108 encodes the moving image data representing the moving image processed in (S2004) and outputs the encoded data.
 In (S2006), the control unit 111 performs the second control, which causes the image processing unit 107 not to execute the image processing. The image processing unit 107 therefore does not execute the image processing.
 In (S2007), the image encoding unit 108 encodes the input moving image data and outputs the encoded data.
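 Steps (S2002) through (S2004) can be illustrated with the sketch below. It is only an example: the digital-zoom-by-center-crop model, the nearest-neighbour resampling, the rule of halving the requested magnification, and the assumption that the last two frames of the buffer are the pre-zoom and post-zoom images are all simplifications, not the actual behaviour of the image processing unit 107.

```python
import numpy as np

def digital_zoom(frame, magnification):
    """Create an 'estimated' frame by cropping the central 1/magnification of the
    image and resizing it back to full size (nearest-neighbour, magnification >= 1)."""
    h, w = frame.shape[:2]
    ch = max(1, int(round(h / magnification)))
    cw = max(1, int(round(w / magnification)))
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = frame[y0:y0 + ch, x0:x0 + cw]
    ys = np.arange(h) * ch // h          # nearest-neighbour row indices
    xs = np.arange(w) * cw // w          # nearest-neighbour column indices
    return crop[np.ix_(ys, xs)]

def insert_estimated_frame(frames, zoom_magnification, threshold):
    """S2002-S2004: when the zoom-in is large, insert a half-way digital zoom of
    the pre-zoom frame between the pre-zoom and post-zoom frames."""
    if zoom_magnification < threshold:        # S2002: small zoom -> S2006/S2007 path
        return list(frames)
    estimated_mag = zoom_magnification / 2.0  # S2003: one possible rule
    pre_zoom = frames[-2]                     # frame just before the zoom takes effect
    estimated = digital_zoom(pre_zoom, estimated_mag)   # S2004
    return list(frames[:-1]) + [estimated, frames[-1]]
```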
 FIG. 3 illustrates the image processing performed by the image processing unit 107.
 In FIG. 3, 301a and 301d each show an image of the entire imaging region, and 302 shows a subject. The portion enclosed by the double line in FIG. 3 shows a method of obtaining a magnification at which the enlarged image of the subject 302 does not exceed the search range 303. By zooming in steps at predetermined magnifications (x1, x1.5, and x2.0 in the figure), a magnification that does not exceed the search range 303 is calculated.
 In the example of FIG. 3, when the image 301a is zoomed at a magnification of 4x to obtain the image 301d, the determination unit 110 determines that the 4x magnification is equal to or greater than the predetermined threshold, and the control unit 111 outputs a parameter indicating 2x, a magnification smaller than 4x. The image processing unit 107 then inserts the image 301b, obtained by enlarging the image 301a at the magnification indicated by that parameter, namely 2x, between the images 301a and 301d as the estimated image. The estimated image may be generated by enlarging only the subject portion of the image 301a, or by enlarging the entire image 301a. The same applies to reduction.
 Since the difference between the images 301a and 301b is smaller than the difference between the images 301a and 301d, the code amount of the encoded data is suppressed.
 Note that, in addition to the image 301b, the image 301c (an image obtained by enlarging the image 301a by a factor of 2) may be inserted between the images 301a and 301d, or only the image 301c may be inserted.
 According to the present embodiment, by setting the threshold of the determination unit 110 so that the search range for obtaining motion vectors does not exceed the predetermined search range 303, the image input to the image encoding unit 108 after the image 301a is controlled by the image processing unit 107, and the code amount of the encoded data can be kept within a certain range.
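 One way to picture the stepped check of FIG. 3 is the sketch below. The geometric model (a subject edge at a given half-width from the zoom centre, moving outward by (m - 1) times that half-width) and the candidate magnifications are assumptions chosen only for illustration.

```python
def largest_safe_magnification(subject_half_width, search_range,
                               candidates=(1.0, 1.5, 2.0, 3.0, 4.0)):
    """Return the largest candidate magnification for which the subject's edge,
    measured from the zoom centre, shifts by no more than the motion-search range."""
    safe = 1.0
    for m in sorted(candidates):
        displacement = (m - 1.0) * subject_half_width  # how far the edge moves outward
        if displacement <= search_range:
            safe = m
    return safe
```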
 Moreover, the difference between the image immediately before the operation and the estimated image, and the difference between the estimated image and the image immediately after the operation, are smaller than the difference between the images immediately before and immediately after the operation. Deterioration of image quality caused by reduced correlation between images is therefore prevented, and the code amount (bit rate) of the encoded data is suppressed. As a result, image quality degradation due to insufficient transmission bandwidth is prevented and a stable image can be supplied. Furthermore, the code amount can be allocated efficiently to images outside the zoom (camera operation) period that require a large code amount.
 In addition, since the determination unit 110 determines whether to cause the image processing unit 107 to execute the image processing based on the acquired magnification obtained by the component information acquisition unit 109, the image processing unit 107 can start the image processing before the reduced inter-image correlation caused by the zoom affects the encoded data. Deterioration of image quality can therefore be prevented in advance.
 Note that the determination unit 110 may make the determination based not only on the acquired magnification but also on the picture type of the input image immediately before the zoom. For example, when the picture type of the input image immediately before the zoom is an I picture (intra-coded picture), it may determine that the image processing unit 107 should not execute the image processing even if the zoom to be performed is more drastic than the reference corresponding to the predetermined threshold. Furthermore, when the picture type of the input image immediately before the zoom is a P picture (predictive-coded picture), the threshold used by the determination unit 110 and the parameter output by the control unit 111 may be set so that the allowable inter-frame difference is smaller than when the picture type is an I picture.
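 A hedged sketch of such a picture-type-dependent adjustment is shown below; the scaling factor and the "never trigger" rule for I pictures are example choices, not values taken from the embodiment.

```python
def effective_threshold(base_threshold, picture_type):
    """Adjust the decision threshold by the picture type of the frame just before the zoom.

    I pictures are intra-coded, so the image processing is never requested for them;
    P pictures tolerate a smaller inter-frame difference, so the threshold is tightened.
    (The factor 0.5 is an arbitrary example value.)"""
    if picture_type == "I":
        return float("inf")          # never trigger the image processing
    if picture_type == "P":
        return base_threshold * 0.5  # require a smaller change before interpolating
    return base_threshold
```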
<< Variation 1 of Embodiment 1 >>
 In the network camera 100 according to Variation 1 of Embodiment 1 of the present invention, the determination unit 110 receives the spatial frequency component acquired by the component information acquisition unit 109 and, instead of the zoom magnification, determines whether to cause the image processing unit 107 to execute the image processing based on the amount of change in the spatial frequency component between the image immediately before the zoom and the image immediately after the zoom. Specifically, when the amount of change in the spatial frequency component between the images before and after the zoom is equal to or greater than a predetermined threshold, it determines that the image processing unit 107 should execute the image processing; when the amount of change is less than the threshold, it determines that the image processing unit 107 should not execute the image processing.
 The threshold compared with the amount of change in the spatial frequency component in the determination by the determination unit 110 can be set based on the relationship among the inter-frame difference, the zoom magnification, and the amount of change in the spatial frequency component. For example, the threshold is set to the amount of change in the spatial frequency component observed when zooming in at the largest enlargement magnification for which the search range for obtaining motion vectors does not exceed the predetermined range.
 The control unit 111 may also hold a correspondence table indicating the correspondence between the amount of change in the spatial frequency component and the zoom magnification, and may determine, based on that table, the enlargement or reduction magnification of the estimated image, that is, the parameter indicating by what factor the image immediately before the zoom is scaled to create the estimated image.
 The remaining configuration is the same as in Embodiment 1, and its description is therefore omitted.
<< Variation 2 of Embodiment 1 >>
 In the network camera 100 according to Variation 2 of Embodiment 1 of the present invention, the determination unit 110 receives the luminance component acquired by the component information acquisition unit 109 and, instead of the zoom magnification, determines whether to cause the image processing unit 107 to execute the image processing based on the amount of change in the luminance component between the image immediately before the zoom and the image immediately after the zoom. Specifically, when the amount of change in the luminance component between the images before and after the zoom is equal to or greater than a predetermined threshold, it determines that the image processing unit 107 should execute the image processing; when the amount of change is less than the threshold, it determines that the image processing unit 107 should not execute the image processing.
 The threshold compared with the amount of change in the luminance component in the determination by the determination unit 110 can be set based on the relationship among the inter-frame difference, the zoom magnification, and the amount of change in the luminance component. For example, the threshold may be set to the amount of change in the luminance component observed when zooming in at the largest enlargement magnification for which the search range for obtaining motion vectors does not exceed the predetermined range.
 The control unit 111 may also hold a correspondence table indicating the correspondence between the amount of change in the luminance component and the zoom magnification, and may determine, based on that table, the enlargement or reduction magnification of the estimated image, that is, the parameter indicating by what factor the image immediately before the zoom is scaled to create the estimated image.
 The remaining configuration is the same as in Embodiment 1, and its description is therefore omitted.
<< Variation 3 of Embodiment 1 >>
 In the network camera 100 according to Variation 3 of Embodiment 1 of the present invention, at the time of zoom, the image processing unit 107 performs, instead of the insertion process of inserting the estimated image between the image immediately before the zoom and the image immediately after the zoom, a replacement process of replacing the image immediately before the zoom with the estimated image.
 The remaining configuration is the same as in Embodiment 1, and its description is therefore omitted.
 Instead of the above replacement process, a replacement process of replacing the image immediately after the zoom with the estimated image may be performed.
<< Variation 4 of Embodiment 1 >>
 In the network camera 100 according to Variation 4 of Embodiment 1 of the present invention, instead of the function of executing the above image processing on the input image at the time of zoom, the image processing unit 107 has a function of executing the following image processing on the input image at the time of a pan and at the time of a tilt.
 The image processing applied at the time of a pan or tilt creates, as the estimated image, an image obtained by shifting the image immediately before the pan or tilt by a movement distance shorter than the movement distance of the pan or tilt, and inserts it between the image immediately before the pan or tilt and the image immediately after the pan or tilt; this insertion process is executed on the input image. For example, when the displacement of the image immediately before the pan or tilt from a predetermined position is 1 and the displacement of the image immediately after the pan or tilt from that position is 4, the image processing unit 107 inserts an image whose displacement from the predetermined position is 2, as the estimated image, between the images before and after the pan or tilt.
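 The pan/tilt case can be sketched in the same spirit as the zoom case; the zero-fill at the uncovered border and the default fraction of one half are assumptions made for this example.

```python
import numpy as np

def shift_frame(frame, dy, dx):
    """Translate a frame by (dy, dx) pixels, filling uncovered areas with zeros."""
    shifted = np.zeros_like(frame)
    h, w = frame.shape[:2]
    ys_src = slice(max(0, -dy), min(h, h - dy))
    xs_src = slice(max(0, -dx), min(w, w - dx))
    ys_dst = slice(max(0, dy), min(h, h + dy))
    xs_dst = slice(max(0, dx), min(w, w + dx))
    shifted[ys_dst, xs_dst] = frame[ys_src, xs_src]
    return shifted

def estimated_pan_frame(pre_frame, full_dy, full_dx, fraction=0.5):
    """Create the estimated image for a pan/tilt: the pre-movement frame shifted
    by only a fraction of the full displacement (fraction is an example value)."""
    return shift_frame(pre_frame, int(full_dy * fraction), int(full_dx * fraction))
```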
 The determination unit 110 determines that the image processing unit 107 should execute the image processing when the movement distance of the pan or tilt is equal to or greater than a predetermined threshold, and determines that the image processing unit 107 should not execute the image processing when the movement distance is less than the threshold.
 The remaining configuration is the same as in Embodiment 1, and its description is therefore omitted.
 According to this variation, by setting the threshold of the determination unit 110 so that the search range for obtaining motion vectors does not exceed a predetermined search range, the image input to the image encoding unit 108 is controlled by the image processing unit 107, and the code amount of the encoded data can be kept within a certain range. In addition, by executing the same steps again on the image inserted as the estimated image, the code amount of the encoded data can be stabilized.
 Also in this Variation 4, as in Variation 1 above, the determination unit 110 may receive the spatial frequency component acquired by the component information acquisition unit 109 and, instead of the movement distance of the pan or tilt, determine whether to cause the image processing unit 107 to execute the image processing of this Variation 4 based on the amount of change in the spatial frequency component between the image immediately before the pan or tilt and the image immediately after it. Specifically, the determination unit 110 may determine that the image processing unit 107 should execute the image processing of this Variation 4 when the amount of change in the spatial frequency component between the images before and after the pan or tilt is equal to or greater than a predetermined threshold, and determine that the image processing unit 107 should not execute it when the amount of change is less than the threshold.
 Further, also in this Variation 4, as in Variation 2 above, the determination unit 110 may, instead of the movement distance of the pan or tilt, determine whether to cause the image processing unit 107 to execute the image processing of this Variation 4 based on the amount of change in the luminance component between the image immediately before the pan or tilt and the image immediately after it. Specifically, the determination unit 110 may determine that the image processing unit 107 should execute the image processing of this Variation 4 when the amount of change in the luminance component between the images before and after the pan or tilt is equal to or greater than a predetermined threshold, and determine that the image processing unit 107 should not execute it when the amount of change is less than the threshold.
 Also in this Variation 4, at the time of a pan or tilt, the image processing unit 107 may perform, instead of the insertion process of inserting the estimated image between the images immediately before and immediately after the pan or tilt, a replacement process of replacing the image immediately before the pan or tilt with the estimated image, or a replacement process of replacing the image immediately after the pan or tilt with the estimated image.
<< Embodiment 2 >>
 As shown in FIG. 4, the camera 400 according to Embodiment 2 of the present invention includes an image encoding device 404 instead of the image encoding device 104 of Embodiment 1. In addition to the configuration of the image encoding device 104, the image encoding device 404 includes a table storage unit 401, and includes a determination unit 410 and a control unit 411 instead of the determination unit 110 and the control unit 111.
 The table storage unit 401 stores a threshold table indicating the correspondence between the amount of change in the spatial frequency component of the images before and after the operation (zoom), acquired by the component information acquisition unit 109, and the control method applied to the image processing unit 107.
 When the component information acquired by the component information acquisition unit 109 changes, the determination unit 410 identifies, from the threshold table stored in the table storage unit 401, the control method corresponding to the amount of change in the spatial frequency component of the images before and after the operation that the component information acquisition unit 109 acquires from the input moving image data.
 The control unit 411 controls the image processing unit 107 using the control method identified by the determination unit 410.
 For example, when the threshold table is as shown in FIG. 5, the control unit 411 performs the second control, which causes the image processing unit 107 not to execute the image processing, when the amount of change in the spatial frequency component of the images before and after the operation is less than a. When the amount of change in the spatial frequency component is at least a and less than b, the control unit 411 performs the first control, which causes the image processing unit 107 to execute the image processing, and outputs a parameter indicating 3 as the magnification of the estimated image. When the amount of change is at least b and less than c, it performs the first control and outputs a parameter indicating 5 as the magnification of the estimated image. When the amount of change is at least c and less than d, it performs the first control and outputs a parameter indicating 7 as the magnification of the estimated image. When the amount of change is d or more, it performs the first control and outputs a parameter indicating 9 as the magnification of the estimated image. The threshold table of FIG. 5 thus indicates the correspondence between the amount of change in the spatial frequency component of the images before and after the operation and one of the first and second controls.
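 Rendered as code, the lookup of FIG. 5 might look like the sketch below, where the boundary values a, b, c, d are placeholders chosen only for illustration (the actual values are not given in the text); the magnifications 3, 5, 7, and 9 follow the description above.

```python
# Assumed example boundaries standing in for a < b < c < d in FIG. 5.
A, B, C, D = 10.0, 20.0, 30.0, 40.0

def lookup_control(spatial_freq_change):
    """Map the change in the spatial frequency component to
    (control, estimated-image magnification)."""
    if spatial_freq_change < A:
        return ("second_control", None)  # do not run the image processing
    if spatial_freq_change < B:
        return ("first_control", 3)
    if spatial_freq_change < C:
        return ("first_control", 5)
    if spatial_freq_change < D:
        return ("first_control", 7)
    return ("first_control", 9)
```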
 FIG. 6 is a flowchart showing the operation of the image encoding device 404 during zoom. In FIG. 6, the same processes as in FIG. 2 are given the same reference numerals and their description is omitted.
 In (S6002), the determination unit 410 determines whether the component information acquired by the component information acquisition unit 109 in (S2001) has changed. If it has changed, the process proceeds to (S6003); if not, the process proceeds to (S2007).
 In (S6003), the determination unit 410 identifies, from the threshold table stored in the table storage unit 401, the control method corresponding to the spatial frequency component acquired by the component information acquisition unit 109 from the input moving image data.
 In (S6004), the control unit 411 controls the image processing unit 107 using the control method identified in (S6003). When performing the first control, the control unit 411 outputs a parameter indicating the magnification of the estimated image. The image processing unit 107 operates according to the control by the control unit 411; when executing the image processing, it generates, as the estimated image, an image obtained by enlarging the image immediately before the zoom at the magnification indicated by the parameter output by the control unit 411.
 In (S6005), when the control unit 411 has performed the first control in (S6004), the image encoding unit 108 encodes the moving image data representing the moving image after the image processing in (S6004) and outputs the encoded data. When the control unit 411 has performed the second control in (S6004), the image encoding unit 108 encodes the input moving image data and outputs the encoded data.
 In FIG. 7, (i) is a graph showing the relationship between the zoom magnification (number of steps) and the amount of change in the spatial frequency component for subject 1, (ii) is the corresponding graph for subject 2, and (iii) is the corresponding graph for subject 3. Since the relationship between the zoom magnification and the amount of change in the spatial frequency component may thus differ from subject to subject, the table storage unit 401 may store a plurality of threshold tables, and the determination unit 410 may select the threshold table used to identify the control method according to the component information (image input conditions) acquired by the component information acquisition unit 109.
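 Selecting among several such tables could be as simple as the following sketch; the dictionary keyed by an image input condition and the "default" fallback entry are assumptions made for illustration.

```python
def select_threshold_table(tables, image_input_condition):
    """Pick the threshold table matching the current input condition (for example a
    subject category inferred from the component information); fall back to a
    'default' table, which this sketch assumes is always present."""
    return tables.get(image_input_condition, tables["default"])
```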
 In the present embodiment, a threshold table indicating the correspondence between the amount of change in the spatial frequency component and the control method is used to identify the control method; a threshold table indicating the correspondence between the amount of change in the luminance component and the control method may be used instead.
 The optimal image processing method is also expected to lie in the vicinity of the previously determined control value. Therefore, input image processing results such as the code amount of the encoded data, frequency component information, and the picture type may be reflected in the threshold table each time they are obtained. In other words, the threshold table may be updated based on the obtained encoded data.
<< Embodiment 3 >>
 As shown in FIG. 8, the camera 800 according to Embodiment 3 of the present invention includes an image encoding device 804 instead of the image encoding device 404 of Embodiment 2. In addition to the configuration of the image encoding device 404, the image encoding device 804 includes a threshold learning unit 801.
 The threshold learning unit 801 checks the relationship between the control method (image processing conditions) and the code amount of the encoded data, and reflects it in the threshold table. Specifically, when the code amount of the encoded data falls to or below a predetermined threshold, it writes the amount of change in the spatial frequency component at that time and the zoom magnification (magnification of the estimated image) into the threshold table in association with each other. If the amount of change in the spatial frequency component at that time is already stored in the threshold table in association with some zoom magnification, the zoom magnification at that time overwrites the stored one. If the amount of change in the spatial frequency component at that time is not stored in the threshold table, the amount of change and the zoom magnification are newly written into the threshold table in association with each other.
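 In outline, this learning step might be expressed as below; representing the threshold table as a dictionary keyed by the (quantized) change amount is an assumption of this sketch, not the actual data structure of the table storage unit 401.

```python
def update_threshold_table(threshold_table, freq_change, magnification, code_amount, code_limit):
    """Reflect one encoding result in the threshold table.

    threshold_table: dict mapping a (quantized) spatial-frequency change to the
    estimated-image magnification to use next time. Only results whose code
    amount stayed within the limit are learned."""
    if code_amount <= code_limit:
        threshold_table[freq_change] = magnification  # overwrite an existing entry or add a new one
    return threshold_table
```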
 FIG. 9 is a flowchart showing the operation of the image encoding device 804 during zoom. In FIG. 9, the same processes as in FIG. 6 are given the same reference numerals and their description is omitted.
 After the image encoding unit 108 encodes the input moving image data in (S2007), the determination unit 410 checks the code amount of the encoded data in (S9001). If the code amount is equal to or less than a predetermined threshold, the process proceeds to (S9005); if it exceeds the threshold, the process proceeds to (S9002).
 In (S9002), the control unit 411 requests image processing from the image processing unit 107, and the image processing unit 107 generates the input data for the image encoding unit 108. At this time, the determination unit 410 identifies, from the threshold table stored in the table storage unit 401, the control method corresponding to the spatial frequency component acquired by the component information acquisition unit 109 from the input moving image data, and the control unit 411 controls the image processing unit 107 using the identified control method. The process then proceeds to (S9003).
 In (S9003), the image encoding unit 108 encodes the input data generated in (S9002). The process then proceeds to (S9004).
 In (S9004), the determination unit 410 compares the code amount of the encoded data generated in (S9003) with a predetermined threshold. If the code amount is equal to or less than the threshold, the process proceeds to (S9006); if it exceeds the threshold, the process proceeds to (S9007).
 In (S9005), the second control is written into the threshold table as the control method associated with the amount of change in the spatial frequency component acquired in (S2001). If the amount of change in the spatial frequency component at that time is already stored in the threshold table in association with some control method, the second control overwrites that control method. If the amount of change in the spatial frequency component at that time is not stored in the threshold table, the amount of change and the second control are newly written into the threshold table in association with each other.
 In (S9006), the threshold learning unit 801 writes the control method (number of steps, i.e., the magnification of the estimated image) used in the immediately preceding image processing (S9002) into the threshold table in association with the amount of change in the spatial frequency component acquired in (S2001). If the amount of change in the spatial frequency component at that time is already stored in the threshold table in association with some control method, only the control method is overwritten. If the amount of change in the spatial frequency component at that time is not stored in the threshold table, the amount of change and the control method of the immediately preceding image processing are newly written into the threshold table in association with each other.
 In (S9007), the control unit 411 sets the control method (magnification of the estimated image) for the next image processing (S9002) so that the code amount of the encoded data obtained by the next encoding (S9003) becomes smaller than the code amount obtained by the previous encoding.
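 The retry loop of (S9001) through (S9007) could be sketched as follows; `process_fn` and `encode_fn` stand in for the image processing unit 107 and the image encoding unit 108, and the rule of halving the magnification on each retry is an assumed example of how the next code amount might be reduced.

```python
def encode_with_rate_feedback(pre_frame, post_frame, process_fn, encode_fn,
                              start_magnification, code_limit, max_retries=3):
    """Re-run the image processing with progressively smaller estimated-image
    magnifications until the code amount drops to the limit (S9002-S9004, S9007)."""
    magnification = start_magnification
    frames = process_fn(pre_frame, post_frame, magnification)       # S9002
    code_amount = sum(len(encode_fn(f)) for f in frames)            # S9003 (bytes per frame)
    for _ in range(max_retries):
        if code_amount <= code_limit:                               # S9004: within budget
            break
        magnification = max(1.0, magnification / 2.0)               # S9007: aim for a smaller code amount
        frames = process_fn(pre_frame, post_frame, magnification)   # S9002 again
        code_amount = sum(len(encode_fn(f)) for f in frames)        # S9003 again
    return frames, magnification, code_amount
```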
 The remaining configuration is the same as in Embodiment 2, and its description is therefore omitted.
 According to the present embodiment, since processing is performed so as to optimize the code amount based on the encoding results, encoding better suited to the scene can be performed.
 Embodiments 1 to 3 and their variations described above can be modified in various ways within the scope of the present invention.
 For example, in Embodiment 3, the processes (S9002) to (S9004) can be performed multiple times, but they may be performed only once to simplify the processing.
 The above Embodiments 2 and 3 and their modifications described above can also be applied to Variations 3 and 4 of Embodiment 1.
 The image encoding device and camera according to the present invention are useful as an image encoding device that encodes input moving image data representing an input image obtained by imaging and as a camera including such an image encoding device, and can be used, for example, in digital cameras and network camera systems.
100, 400, 800   Network camera (camera)
101   Image sensor
103   Moving image data generation unit
104, 404, 804   Image encoding device
106   Transmission device
107   Image processing unit
108   Image encoding unit
110, 410   Determination unit
111, 411   Control unit
401   Table storage unit

Claims (13)

  1.  An image encoding device that, when a camera performs one operation among zoom, pan, and tilt, encodes, using inter-frame prediction, input moving image data representing an input image obtained by imaging with the camera, the image encoding device comprising:
     an image processing unit that executes, on the input image obtained when the operation is performed at a predetermined degree, one of an insertion process of inserting, between the image immediately before the operation and the image immediately after the operation, an estimated image corresponding to the operation performed at a degree lower than the predetermined degree, and a replacement process of replacing the image immediately before the operation or the image immediately after the operation with the estimated image; and
     an image encoding unit that encodes moving image data representing the moving image after the image processing by the image processing unit and outputs encoded data,
     wherein, when the operation is zoom, the degree is a magnification, and
     when the operation is pan or tilt, the degree is a movement distance.
  2.  The image encoding device according to claim 1, further comprising:
     a determination unit that determines whether to cause the image processing unit to execute the image processing; and
     a control unit that performs, on the image processing unit, one of a first control that causes the image processing to be executed and a second control that causes the image processing not to be executed, based on the determination result of the determination unit.
  3.  The image encoding device according to claim 2, wherein
     the operation is zoom, and
     the determination unit determines, based on the zoom magnification, that the image processing unit should execute the image processing when the zoom is an enlargement or reduction more drastic than a predetermined reference, and determines that the image processing unit should not execute the image processing when the zoom is not an enlargement or reduction more drastic than the predetermined reference.
  4.  The image encoding device according to claim 2, wherein
     the operation is pan or tilt, and
     the determination unit determines that the image processing unit should execute the image processing when the movement distance of the pan or tilt is equal to or greater than a predetermined threshold, and determines that the image processing unit should not execute the image processing when the movement distance is less than the predetermined threshold.
  5.  The image encoding device according to claim 2, wherein
     the determination unit determines that the image processing unit should execute the image processing when the amount of change in the spatial frequency component of the images before and after the operation is equal to or greater than a predetermined threshold, and determines that the image processing unit should not execute the image processing when the amount of change is less than the predetermined threshold.
  6.  The image encoding device according to claim 2, wherein
     the determination unit determines that the image processing unit should execute the image processing when the amount of change in the luminance component of the images before and after the operation is equal to or greater than a predetermined threshold, and determines that the image processing unit should not execute the image processing when the amount of change is less than the predetermined threshold.
  7.  The image encoding device according to claim 2, further comprising
     a table storage unit that stores a threshold table indicating the correspondence between the amount of change in component information of the images before and after the operation, obtained from moving image data, and one of the first and second controls,
     wherein the determination unit refers to the threshold table stored in the table storage unit and identifies one of the first and second controls based on the amount of change in the component information of the images before and after the operation obtained from the input moving image data, and
     the control unit executes the one of the first and second controls identified by the determination unit.
  8.  The image encoding device according to claim 7, wherein
     the table storage unit stores a plurality of types of threshold tables, and
     the determination unit selects, from the plurality of types of threshold tables, the threshold table used to identify the first or second control, based on image input conditions obtained from the input moving image data.
  9.  The image encoding device according to claim 7, wherein
     the threshold table stored in the table storage unit is updated based on the encoded data output by the image encoding unit.
  10.  The image encoding device according to claim 1, wherein
     the operation is zoom, and
     the image encoding device further comprises:
     a table storage unit that stores a threshold table indicating the correspondence between the amount of change in the spatial frequency component of the images before and after the operation, obtained from moving image data, and the magnification of the estimated image;
     a threshold learning unit that writes, into the threshold table, the magnification of the estimated image at which the code amount of the encoded data output by the image encoding unit becomes equal to or less than a predetermined threshold, in association with the amount of change in the spatial frequency component;
     a determination unit that refers to the threshold table stored in the table storage unit and identifies the magnification of the estimated image based on the amount of change in the spatial frequency component of the images before and after the operation obtained from the input moving image data; and
     a control unit that outputs a parameter indicating the magnification of the estimated image identified by the determination unit,
     wherein the image processing unit generates, as the estimated image, an image obtained by enlarging the image immediately before the operation at the magnification indicated by the parameter output by the control unit.
  11.  The image encoding device according to claim 2, wherein
     the determination unit performs the determination based on the picture type of the input image.
  12.  The image encoding device according to claim 1, wherein
     the operation is zoom, and
     the image processing unit creates the estimated image by enlarging or reducing part or all of the image immediately before the operation.
  13.  A camera comprising:
     the image encoding device according to any one of claims 1 to 12;
     an image sensor that converts the amount of light from a subject into an electrical signal and outputs the electrical signal;
     a moving image data generation unit that generates the input moving image data based on the electrical signal output by the image sensor; and
     a transmission device that transmits the encoded data output by the image encoding unit of the image encoding device.
PCT/JP2011/003835 2010-07-15 2011-07-05 Image encoding device and camera WO2012008117A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010-160983 2010-07-15
JP2010160983 2010-07-15

Publications (1)

Publication Number Publication Date
WO2012008117A1 true WO2012008117A1 (en) 2012-01-19

Family

ID=45469136

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/003835 WO2012008117A1 (en) 2010-07-15 2011-07-05 Image encoding device and camera

Country Status (1)

Country Link
WO (1) WO2012008117A1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0686137A (en) * 1992-09-02 1994-03-25 Fuji Photo Optical Co Ltd Remote controller for television camera
JP2006094058A (en) * 2004-09-22 2006-04-06 Nikon Corp Image processing apparatus, program, and method for pre-process to reproduce stationary image as dynamic image
JP2007166583A (en) * 2005-11-16 2007-06-28 Canon Inc Video distribution apparatus and viewer apparatus
JP2007166022A (en) * 2005-12-09 2007-06-28 Mitsubishi Electric Corp System and device for displaying image

Similar Documents

Publication Publication Date Title
EP1607909A1 (en) Moving image coding apparatus
JP4682990B2 (en) Camera image compression processing apparatus and compression processing method
JPH09271026A (en) Image encoding device
JP2010035133A (en) Moving image encoding apparatus and moving image encoding method
JP5560009B2 (en) Video encoding device
US8705628B2 (en) Method and device for compressing moving image
JP2009260892A (en) Image processing apparatus, control method therefor, and program
JP2010057166A (en) Image coding apparatus, image coding method, integrated circuit, and camera
JP2010232734A (en) Image encoding apparatus, and image encoding method
JP5583439B2 (en) Image encoding apparatus and camera system
JP4257789B2 (en) Video encoding device
JP6086619B2 (en) Encoding apparatus and encoding method
CN102300091B (en) Code device, electronic equipment, imaging device and imaging system
JP2008141354A (en) Image coding apparatus and imaging apparatus
JP2010239423A (en) Photographing resolution predictive video encoding and decoding apparatus
WO2012008117A1 (en) Image encoding device and camera
JP2007228101A (en) Dynamic-image coding equipment
JP4533089B2 (en) Movie data generator
JP2006303987A (en) Image encoder and image encoding method
JP4564856B2 (en) Image encoding apparatus and imaging apparatus
JP2004180345A (en) Photographed image recording apparatus
JP4335779B2 (en) Encoding apparatus, recording apparatus using the same, encoding method, and recording method
JP2005323315A (en) Prediction information/quantization value control compression coding apparatus, program, and method
JP2018191136A (en) Encoding device, encoding method and program
JP4834535B2 (en) Imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11806453

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11806453

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP