WO2011091755A1 - Method and device for processing a multi-picture video image - Google Patents

Method and device for processing a multi-picture video image

Info

Publication number
WO2011091755A1
Authority
WO
WIPO (PCT)
Prior art keywords
video image
picture video
sub
value
pixel
Prior art date
Application number
PCT/CN2011/070688
Other languages
English (en)
Chinese (zh)
Inventor
王浦林
Original Assignee
华为终端有限公司
Priority date
Filing date
Publication date
Application filed by 华为终端有限公司
Priority to EP11736637.7A (patent EP2521351B1)
Publication of WO2011091755A1
Priority to US13/562,003 (patent US8947498B2)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/15 Conference systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/14 Picture signal circuitry for video frequency region
    • H04N 5/20 Circuitry for controlling amplitude response
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/64 Circuits for processing colour signals
    • H04N 9/73 Colour balance circuits, e.g. white balance circuits or colour temperature control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/1446 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2320/00 Control of display operating conditions
    • G09G 2320/02 Improving the quality of display appearance
    • G09G 2320/0233 Improving the luminance or brightness uniformity across the screen
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2340/00 Aspects of display data processing
    • G09G 2340/02 Handling of images in compressed format, e.g. JPEG, MPEG
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2340/00 Aspects of display data processing
    • G09G 2340/12 Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2360/00 Aspects of the architecture of display systems
    • G09G 2360/16 Calculation or use of calculated indices related to luminance levels in display data
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2370/00 Aspects of data communication
    • G09G 2370/20 Details of the management of multiple sources of image data

Definitions

  • Multi-picture video image processing method and device. This application claims priority to Chinese Patent Application No. CN 201010104939.4, filed with the Chinese Patent Office on January 29, 2010 and entitled "Multi-picture video image processing method and device", which is incorporated herein by reference in its entirety.
  • TECHNICAL FIELD The present invention relates to the field of video image processing, and in particular, to a multi-picture video image processing method and apparatus.
  • BACKGROUND OF THE INVENTION With the development of coding and information compression technologies and the rapid development of digital networks, video conferencing systems have begun to enter the market.
  • In particular, the video conferencing system based on the H.323 video conferencing standard is being widely used along with the rapid development of IP networks. Government, military, and enterprise departments have basically deployed their own video conferencing systems to improve meeting efficiency and reduce meeting costs.
  • The mode of video conferencing has evolved from a single point-to-point call with a single-screen display of the remote party to a conference mode in which multiple sites hold a conference simultaneously, with the pictures of the different sites composed into a multi-picture display or output to multiple display devices at the same time.
  • When pictures from different sites are composed into a multi-picture display, or when multiple sites are output to multiple display devices at the same time, the scene environment, lighting, cameras and other equipment differ from site to site, so the video atmosphere of each picture is not the same; in particular, the chroma and brightness show noticeable differences. The resulting inconsistency felt by the participants reduces the visual experience.
  • In order to reduce the above effects and enhance the visual experience of the participants, the prior art provides the following two solutions.
  • Solution 1: source picture adjustment, that is, adjusting the effect of a site's input picture before the picture of that site is output to the destination site. Each site individually adjusts the parameters of its own video capture device, for example the color and brightness parameters of the camera, or relies on the automatic adjustment function supported by the camera, so that the effect of its own picture is improved to what it considers optimal before the picture is output to the destination site for splicing into a multi-picture.
  • However, the sites differ in their video capture devices, in the intensity and color of their lighting, and in their general environment. For example, some sites use cold-colored lights and the whole site has a cold hue, while other sites use warm lights and the whole site has a warm hue. With the technical means provided in Solution 1, a site adjusting its own picture effect cannot fully take the environments of the other sites into account. As a result, when the pictures of the various sites are composed into a multi-picture, there are still significant differences in the brightness and color characteristics of the individual sub-pictures, which degrades the visual experience.
  • Solution 2: output picture adjustment, that is, adjusting the effect of the picture output at the receiving site. For example, in a multi-picture conference, when a multi-picture is received, the output mode of the display device (for example, parameters such as color and brightness) is adjusted so that all sub-pictures are changed simultaneously and the output effect of the entire multi-picture is adjusted uniformly. Because the display device applies the same adjustment to the whole multi-picture, however, the individual characteristics of each sub-picture cannot be satisfied at the same time.
  • The embodiments of the present invention provide a method and an apparatus for processing a multi-picture video image, which solve the problem that the participants' visual experience deteriorates because adjusting the multi-picture as a whole cannot satisfy the characteristics of each sub-picture at the same time.
  • A multi-picture video image processing method includes: receiving data code streams of multiple channels of sub-picture video images; separately equalizing the effect of each channel of sub-picture video image by using a control parameter according to the image features of that sub-picture video image; and composing the equalized multiple channels of sub-picture video images into a multi-picture video image.
  • A multi-picture video image processing apparatus includes: an equalization module, configured to receive the data code streams of multiple channels of sub-picture video images and to separately equalize the effect of each channel of sub-picture video image by using a control parameter according to the image features of that sub-picture video image; and a synthesizing module, configured to compose the multiple channels of sub-picture video images equalized by the equalization module into a multi-picture video image.
  • In the embodiments of the present invention, the image features of the multiple channels of sub-picture video images are acquired, an adjustment coefficient is calculated independently for each channel according to the acquired image features, and the sub-picture video images from the individual sites are equalized by using the same control parameter together with the independently calculated adjustment coefficients before the multi-picture video image is formed. Because the adjustment coefficient of each independent sub-picture video image is calculated separately according to the same control parameter, after equalization with the adjustment coefficient and the same control parameter the sub-picture video images can be uniformly adjusted toward the same image features.
  • FIG. 1 is a schematic flowchart of a multi-picture video image processing method according to a first embodiment of the present invention
  • FIG. 2 is a schematic flowchart of an effect method for equalizing multi-channel sub-picture video images by using control parameters according to Embodiment 2 of the present invention
  • FIG. 3 is a schematic flowchart of a multi-picture video image processing method according to Embodiment 3 of the present invention
  • FIG. 4 is a schematic flowchart of an effect method for equalizing multi-channel sub-picture video images by using control parameters according to Embodiment 4 of the present invention
  • FIG. 5 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 5 of the present invention
  • FIG. 6 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 6 of the present invention
  • FIG. 7 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 7 of the present invention;
  • FIG. 8 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 8 of the present invention;
  • FIG. 9 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 9 of the present invention;
  • FIG. 10 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 10 of the present invention;
  • FIG. 11 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 11 of the present invention;
  • FIG. 12 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 12 of the present invention
  • FIG. 13 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 13 of the present invention
  • FIG. 14 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 14 of the present invention;
  • FIG. 15 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 15 of the present invention
  • FIG. 16 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 16 of the present invention;
  • FIG. 17 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 17 of the present invention;
  • FIG. 18 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 18 of the present invention
  • FIG. 19 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 19 of the present invention;
  • FIG. 20 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 20 of the present invention.
  • The technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings of the embodiments of the present invention. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
  • In the embodiments of the present invention, the sub-picture video image from each individual site is equalized with the same control parameter and an independently calculated adjustment coefficient before the multi-picture video image is formed, so that the finally formed multi-picture video image reflects uniform characteristics.
  • FIG. 1 is a schematic flowchart of a multi-picture video image processing method according to Embodiment 1 of the present invention, which mainly includes the following steps:
  • S101: Receive the data code streams of multiple channels of sub-picture video images, and separately equalize the effect of each channel of sub-picture video image by using a control parameter according to the image features of that sub-picture video image. Here, the multiple channels of sub-picture video images refer to the set of sub-picture video images coming from the individual sites (or site units) in a video conference.
  • S102: Compose the equalized multiple channels of sub-picture video images into a multi-picture video image.
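  • For illustration only (this sketch is not part of the patent text), the flow of S101 and S102 can be outlined in Python/NumPy roughly as follows, using the 8-bit example values that appear later in this description (Lo = 128, pixel values in [0, 255]); the function names equalize_luma and compose_2x2 and the 2x2 layout are assumptions made for the example:

```python
import numpy as np

def equalize_luma(y_plane: np.ndarray, lo: int = 128) -> np.ndarray:
    """Equalize one sub-picture's luma plane toward the reference value lo (S101)."""
    p0 = int(np.median(y_plane))                       # pixel value of the N/2-th counted pixel (P0)
    p0 = min(max(p0, 1), 254)                          # keep both correction segments well-defined
    cl1 = lo / p0                                      # CL1 = Lo / P0         (A_pix = 0)
    cl2 = lo / (255 - p0)                              # CL2 = Lo / (255 - P0) (B_pix = 255)
    p = y_plane.astype(np.float32)
    corrected = np.where(p <= p0, cl1 * p, 255 - cl2 * (255 - p))
    return np.clip(corrected, 0, 255).astype(np.uint8)

def compose_2x2(subs):
    """Compose four equalized sub-pictures of equal size into one multi-picture (S102)."""
    a, b, c, d = subs
    return np.vstack([np.hstack([a, b]), np.hstack([c, d])])

# Four luma planes, one decoded sub-picture per site.
sites = [np.random.randint(0, 256, (180, 320), dtype=np.uint8) for _ in range(4)]
multi_picture = compose_2x2([equalize_luma(s, lo=128) for s in sites])
```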
  • FIG. 2 is a basic flowchart of the method for equalizing the effect of multiple channels of sub-picture video images by using control parameters according to Embodiment 2 of the present invention, which includes the following steps:
  • S201: Acquire the image features of the current sub-picture video image. The overall effect of a sub-picture video image is the result of the interaction of all the pixel points (pixels) in the sub-picture; however, acquiring the image features of the sub-picture video image does not require the parameters of all the pixel points, that is, it is sufficient to count the pixel points within a certain value range.
  • For example, from the point of view of simplicity and ease of implementation, the histogram statistical method can be used to count, in one frame of the sub-picture video image, the pixel points whose luminance (Lum) values lie in the interval [A_lum, B_lum], where A_lum is greater than or equal to 0, B_lum is less than or equal to Lm, and Lm is the maximum value used when describing the luminance standard. For example, when the luminance value is represented in 8-bit decimal form, Lm takes 255, which is the maximum value used when describing the luminance standard; when the luminance value is represented in 16-bit decimal form, Lm takes 65535, which is the maximum value used when describing the luminance standard. The present invention does not limit this.
  • A pixel point-luminance value chart reflecting the effect of the current sub-picture video image can then be constructed from the pixel points whose luminance values lie in the interval [A_lum, B_lum] and the luminance values corresponding to those pixel points, and the brightness characteristic of the current sub-picture video image can be determined from the pixel point-luminance value chart.
  • For example, if the luminance values of a large proportion of the pixel points in the pixel point-luminance value chart (for example, more than 80% of the pixel points) are smaller than the luminance value of a frame with normal brightness (for example, a luminance value of 100), it can be determined that the brightness characteristic of the current sub-picture video image is "image dark", and the brightness of the current sub-picture video image needs to be appropriately increased by some means; conversely, if the luminance values of a large proportion of the pixel points in the pixel point-luminance value chart (for example, more than 80% of the pixel points) are greater than the luminance value of a frame with normal brightness (for example, a luminance value of 100), it can be determined that the brightness characteristic of the current sub-picture video image is "image bright", and the brightness of the current sub-picture video image needs to be appropriately reduced by some means.
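  • As an illustrative sketch only (the 80% ratio and the "normal" value 100 are taken from the example above; the helper name luminance_feature is an assumption), the histogram statistics and the dark/bright decision could look like this:

```python
import numpy as np

def luminance_feature(y_plane: np.ndarray, a_lum: int = 0, b_lum: int = 255,
                      normal: int = 100, ratio: float = 0.8) -> str:
    """Classify one sub-picture as dark/bright from its pixel point-luminance value statistics."""
    sel = y_plane[(y_plane >= a_lum) & (y_plane <= b_lum)]      # pixels inside [A_lum, B_lum]
    hist, _ = np.histogram(sel, bins=b_lum - a_lum + 1, range=(a_lum, b_lum + 1))
    total = hist.sum()
    below = hist[:max(normal - a_lum, 0)].sum()                 # pixels darker than the "normal" level
    above = total - below
    if total and below > ratio * total:
        return "image dark"       # most pixels darker than a normally lit frame
    if total and above > ratio * total:
        return "image bright"     # most pixels brighter than a normally lit frame
    return "normal"
```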
  • Similarly, the histogram statistical method can also be used to count the pixel points whose chroma (Chroma) values lie within a certain range in one frame of the sub-picture video image in order to determine the chrominance characteristic of the sub-picture video image, for example by counting the pixel points whose chroma red (CR) values or chroma blue (CB) values lie within a certain range in one frame of the sub-picture video image. Since white is the basic color and its luminance value is large (for example, usually 200 or more when the value is represented in 8-bit decimal form), the statistics here differ from the statistics of the image luminance values: in the embodiment of the present invention, for the statistics of the CR value or the CB value, the pixel points in the current sub-picture video image whose chromaticity values are close to the chromaticity value of the white area are counted, a pixel point-chrominance value chart is constructed from these pixel points and their corresponding chromaticity values, and the chrominance characteristic of the current sub-picture video image is determined from the pixel point-chrominance value chart.
  • In general, the CR value or the CB value of the white area is close to 128 (the value when the chromaticity value is represented in 8-bit decimal form). In view of this, in the embodiment of the present invention, the pixel points whose chromaticity values lie in the interval [128-T2, 128+T2] can be counted; these pixel points, whose chrominance values lie in [128-T2, 128+T2], are the pixel points close to the chromaticity value of the white area. Therefore, if, in the pixel point-chrominance value chart constructed from the pixel points whose chromaticity values are close to the chromaticity value of the white area, the chromaticity values (CR values or CB values) of most of the pixel points are smaller than 128 or greater than 128, it can be determined that the chrominance characteristic of the current sub-picture video image has a chromaticity offset, such as bluish, greenish or reddish, and the chromaticity of the current sub-picture video image needs to be appropriately adjusted by some means.
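  • The white-area statistics above can be sketched as follows (an interpretation for illustration; T2 = 16 and the small tolerance around 128 are assumed values, not values from the patent):

```python
import numpy as np

def chroma_feature(chroma_plane: np.ndarray, t2: int = 16, tol: int = 2) -> str:
    """chroma_plane is an 8-bit CR or CB plane; 128 is the neutral (white-area) value."""
    near_white = chroma_plane[np.abs(chroma_plane.astype(np.int16) - 128) <= t2]
    if near_white.size == 0:
        return "no near-white samples"
    med = float(np.median(near_white))      # bulk of the pixel point-chrominance value chart
    if med < 128 - tol:
        return "chromaticity offset (below 128)"
    if med > 128 + tol:
        return "chromaticity offset (above 128)"
    return "no chromaticity offset"
```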
  • S202: Calculate an adjustment coefficient according to the control parameter and the image features of the current sub-picture video image. Because the received sub-picture video image may have a luminance offset or a chrominance offset, an adjustment coefficient can be calculated according to the control parameter and the image features of the current sub-picture video image, and the adjustment coefficient and the control parameter are then used to correct the chromaticity offset or the luminance offset of the sub-picture video image, so as to equalize the effect of the sub-picture video image.
  • Specifically, a first brightness adjustment coefficient CL1 and a second brightness adjustment coefficient CL2 of the current sub-picture video image can be calculated according to the image features of the current sub-picture video image and one predetermined or set control parameter, where the control parameter can be the reference value used when the current sub-picture video image is equalized to the target brightness value, denoted Lo in this embodiment. Using the reference value Lo ensures that the current sub-picture video image is neither too bright nor too dark after being corrected. The calculation of the first brightness adjustment coefficient CL1 and the second brightness adjustment coefficient CL2 of the current sub-picture video image can specifically be as follows:
  • The pixel points whose pixel values lie in the interval [A_pix, B_pix] in the pixel point-luminance value chart are counted starting from the pixel point whose pixel value is A_pix; when the K-th pixel point is counted, the pixel value P0 of the K-th pixel point is obtained. For example, the counting can proceed up to half of the pixel points in the pixel point-luminance value chart, that is, up to the N/2-th pixel point (when N/2 is not an integer, the pixel point closest to N/2 can be taken), and the pixel value P0 of the N/2-th pixel point is obtained, where N is the number of pixel points in the pixel point-luminance value chart of the current sub-picture video image, that is, the number of pixel points sampled into the pixel point-luminance value chart.
  • The first brightness adjustment coefficient CL1 and the second brightness adjustment coefficient CL2 are then calculated from A_pix, B_pix, P0 and Lo, where both the first brightness adjustment coefficient CL1 and the second brightness adjustment coefficient CL2 are linear with Lo. For example, the first brightness adjustment coefficient CL1 can be Lo/(P0 - A_pix) and the second brightness adjustment coefficient CL2 can be Lo/(B_pix - P0); obviously, both CL1 and CL2 are linear with Lo.
  • For example, if the reference value Lo used when equalizing the current sub-picture video image to the target brightness value is 128, the pixel point-luminance value chart of the current sub-picture video image contains N pixel points, the pixel values lie in the interval [0, 255] (255 being the maximum pixel value when the pixel value is represented in 8-bit decimal form), and counting starts from the pixel point whose pixel value is 0, then the pixel value P0 of the N/2-th pixel point is obtained, where P0 lies in the interval [0, 255]; in this case the first brightness adjustment coefficient CL1 is 128/P0 and the second brightness adjustment coefficient CL2 is 128/(255 - P0).
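  • For illustration, the computation of CL1 and CL2 described above can be sketched as follows (the helper name and the use of np.median for the N/2-th counted pixel are assumptions made for the example):

```python
import numpy as np

def luma_adjustment_coefficients(y_plane: np.ndarray, lo: int = 128,
                                 a_pix: int = 0, b_pix: int = 255):
    """Return (CL1, CL2, P0) with CL1 = Lo/(P0 - A_pix) and CL2 = Lo/(B_pix - P0)."""
    sel = y_plane[(y_plane >= a_pix) & (y_plane <= b_pix)]
    p0 = int(np.median(sel))                      # value of the N/2-th counted pixel (P0)
    p0 = min(max(p0, a_pix + 1), b_pix - 1)       # avoid division by zero at the interval ends
    cl1 = lo / (p0 - a_pix)                       # first brightness adjustment coefficient
    cl2 = lo / (b_pix - p0)                       # second brightness adjustment coefficient
    return cl1, cl2, p0

# With Lo = 128 and the full 8-bit range, this reduces to CL1 = 128/P0 and CL2 = 128/(255 - P0).
```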
  • Similarly, a first chromaticity adjustment coefficient CC1 and a second chromaticity adjustment coefficient CC2 of the current sub-picture video image can also be calculated according to the image features of the current sub-picture video image and one predetermined or set control parameter, where the control parameter can be the reference value used when the current sub-picture video image is equalized to the target chromaticity value, denoted Co in this embodiment. Using the reference value Co ensures that the current sub-picture video image has no chromaticity offset after being corrected. The calculation of the first chromaticity adjustment coefficient CC1 and the second chromaticity adjustment coefficient CC2 of the current sub-picture video image can specifically be as follows:
  • The pixel points whose pixel values lie in the interval [J_pix, K_pix] in the pixel point-chrominance value chart are counted starting from the pixel point whose pixel value is J_pix; when the J-th pixel point is counted, the pixel value P1 of the J-th pixel point is obtained, where J_pix is greater than or equal to 0, K_pix is less than or equal to Y, and Y is the maximum value used when describing the pixel value standard, for example 255 or 65535. For the pixel points in the interval [J_pix, K_pix], the counting can proceed up to half of the pixel points in the pixel point-chrominance value chart, that is, up to the N/2-th pixel point (when N/2 is not an integer, the pixel point closest to N/2 can be taken), and the pixel value P1 of the N/2-th pixel point is obtained, where N is the number of pixel points in the pixel point-chrominance value chart of the current sub-picture video image, that is, the number of pixel points sampled into the pixel point-chrominance value chart.
  • The first chromaticity adjustment coefficient CC1 and the second chromaticity adjustment coefficient CC2 are then calculated from J_pix, K_pix, P1 and Co, where both the first chromaticity adjustment coefficient CC1 and the second chromaticity adjustment coefficient CC2 are linear with the reference value Co. For example, the first chromaticity adjustment coefficient CC1 can be Co/(P1 - J_pix) and the second chromaticity adjustment coefficient CC2 can be Co/(K_pix - P1); obviously, both CC1 and CC2 are linear with Co.
  • For example, if the reference value Co used when equalizing the current sub-picture video image to the target chromaticity value is 128, the pixel point-chrominance value chart of the current sub-picture video image contains N pixel points, the pixel values lie in the interval [0, 255] (255 being the maximum pixel value when the pixel value is represented in 8-bit decimal form), and counting starts from the pixel point whose pixel value is 0, then the pixel value P1 of the N/2-th pixel point is obtained, where P1 lies in the interval [0, 255]; in this case the first chromaticity adjustment coefficient CC1 is 128/P1 and the second chromaticity adjustment coefficient CC2 is 128/(255 - P1).
  • the adjustment mode of the embodiment enables the sub-picture video image to exhibit the same picture characteristics when displayed.
  • S203: Equalize the effect of the current sub-picture video image by using the control parameter and the adjustment coefficient.
  • For the brightness, the luminance values of the pixel points whose pixel values lie in the interval [A_pix, P0] can be linearly corrected by using the reference value Lo and the first brightness adjustment coefficient CL1 calculated in S202, so as to obtain the luminance value L1 of the current sub-picture video image on the [A_pix, P0] interval; and the luminance values of the pixel points whose pixel values lie in the interval [P0, B_pix] can be linearly corrected by using the reference value Lo and the second brightness adjustment coefficient CL2 calculated in S202, so as to obtain the luminance value L2 of the current sub-picture video image on the [P0, B_pix] interval.
  • Linear correction here means that the luminance values of the pixel points on the interval [A_pix, P0] (or [P0, B_pix]) are increased or decreased in the same proportion; because every pixel point is increased or decreased in the same proportion, the visual effect of the entire sub-picture video image after linear correction remains coordinated, with no locally dark or locally bright regions.
  • Specifically, the luminance value of a pixel point in the current sub-picture video image whose pixel value lies in the interval [A_pix, P0] can be corrected to CL1 × (P - A_pix) and then output, and the luminance value of a pixel point in the current sub-picture video image whose pixel value lies in the interval [P0, B_pix] can be corrected to B_lum - (CL2 × (B_pix - P)) and then output, where P is the pixel value of the pixel point in the current sub-picture video image before equalization.
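  • A sketch of this linear correction (assuming, as in the 8-bit example, that B_lum is numerically equal to B_pix = 255):

```python
import numpy as np

def apply_luma_equalization(y_plane: np.ndarray, cl1: float, cl2: float, p0: int,
                            a_pix: int = 0, b_pix: int = 255, b_lum: int = 255) -> np.ndarray:
    """Correct luminance to CL1*(P - A_pix) on [A_pix, P0] and to B_lum - CL2*(B_pix - P) on [P0, B_pix]."""
    p = y_plane.astype(np.float32)
    low = cl1 * (p - a_pix)                  # pixels whose value lies in [A_pix, P0]
    high = b_lum - cl2 * (b_pix - p)         # pixels whose value lies in [P0, B_pix]
    corrected = np.where(p <= p0, low, high)
    return np.clip(corrected, 0, 255).astype(np.uint8)

# Typical use together with the coefficient sketch above:
#   cl1, cl2, p0 = luma_adjustment_coefficients(y_plane, lo=128)
#   y_equalized = apply_luma_equalization(y_plane, cl1, cl2, p0)
```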
  • For the chrominance, the chromaticity values of the pixel points whose pixel values lie in the interval [J_pix, P1] can be linearly corrected by using the reference value Co and the first chromaticity adjustment coefficient CC1 calculated in S202, so as to obtain the chrominance value C1 of the current sub-picture video image on the [J_pix, P1] interval; and the chromaticity values of the pixel points whose pixel values lie in the interval [P1, K_pix] can be linearly corrected by using the reference value Co and the second chromaticity adjustment coefficient CC2 calculated in S202, so as to obtain the chrominance value C2 of the current sub-picture video image on the [P1, K_pix] interval. In this way the chrominance of the current sub-picture video image is equalized.
  • Specifically, the chromaticity value of a pixel point in the current sub-picture video image whose pixel value lies in the interval [J_pix, P1] can be corrected to CC1 × (P - J_pix) and then output, and the chromaticity value of a pixel point in the current sub-picture video image whose pixel value lies in the interval [P1, K_pix] can be corrected to K_chr - (CC2 × (K_pix - P)) and then output, where P is the pixel value of the pixel point in the current sub-picture video image before equalization, and K_chr is the chromaticity value of the pixel point whose pixel value is K_pix, which is numerically equal to K_pix.
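  • The chrominance branch mirrors the luminance branch; a combined sketch (coefficient calculation plus correction, with K_chr taken numerically equal to K_pix as stated above, and assumed helper names) might look like this:

```python
import numpy as np

def equalize_chroma(c_plane: np.ndarray, co: int = 128,
                    j_pix: int = 0, k_pix: int = 255) -> np.ndarray:
    """Equalize one 8-bit CR or CB plane toward the reference value Co."""
    sel = c_plane[(c_plane >= j_pix) & (c_plane <= k_pix)]
    p1 = int(np.median(sel))                 # value of the N/2-th counted pixel (P1)
    p1 = min(max(p1, j_pix + 1), k_pix - 1)
    cc1 = co / (p1 - j_pix)                  # first chromaticity adjustment coefficient CC1
    cc2 = co / (k_pix - p1)                  # second chromaticity adjustment coefficient CC2
    p = c_plane.astype(np.float32)
    low = cc1 * (p - j_pix)                  # [J_pix, P1] -> CC1 * (P - J_pix)
    high = k_pix - cc2 * (k_pix - p)         # [P1, K_pix] -> K_chr - CC2 * (K_pix - P)
    corrected = np.where(p <= p1, low, high)
    return np.clip(corrected, 0, 255).astype(np.uint8)
```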
  • The control parameters (that is, the reference value Lo or the reference value Co) need not always be fixed. If the control parameters are fixed, the same values are used throughout the processing of the sub-picture video images over a long period of time; for example, in the above embodiment the reference value Lo or Co used when equalizing the current sub-picture video image to the target brightness or chromaticity value is fixed at 128. If the processing flow of the above embodiment is modified as shown in FIG. 3, that is, if an operation of updating the control parameters is added, the control parameters can instead be obtained during the long-term processing of the sub-picture video images by detecting the overall features of the sub-picture video images. For example, the control parameters can be kept in a range close to that of the majority of the sub-picture video images: if the input sub-picture video images all exhibit similar Lum/CB/CR features (for example, most of the input sub-picture video images appear dark, with luminance around 100), the reference value Lo in the control parameters can be uniformly controlled at about 100. In this way, the image features of the sub-picture video images with similar styles are maintained, and only the image features of the few sub-picture video images with inconsistent styles are adapted to the overall picture features.
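  • One possible sketch of this control-parameter update (the majority rule and the similarity spread of ±20 are assumptions made for the example, not values from the patent):

```python
import numpy as np

def update_lo(sub_pictures, default_lo: int = 128, spread: int = 20) -> int:
    """Move the reference value Lo toward the luminance level shared by most input sub-pictures."""
    medians = np.array([np.median(s) for s in sub_pictures], dtype=np.float32)
    common = float(np.median(medians))
    similar = np.abs(medians - common) <= spread        # sites whose overall brightness is "similar"
    if similar.sum() >= 0.5 * len(medians):             # a majority shares one style
        return int(round(common))                       # e.g. around 100 when most inputs are dark
    return default_lo                                   # otherwise keep the fixed reference value
```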
  • In this way, each sub-picture video image can be uniformly adjusted to have the same image features, so that the sub-picture video images exhibit the same picture characteristics when displayed; the display of the multi-picture video image composed of the sub-picture video images thus achieves an overall harmony of style and enhances the participants' visual experience.
  • FIG. 4 is a basic flowchart of the method for equalizing the effect of multiple channels of sub-picture video images by using control parameters according to Embodiment 4 of the present invention, which includes the following steps:
  • In a video conference, the environment in which the video images are captured is basically fixed and the temporal similarity between frames is very high, that is, the image features of the current frame of a sub-picture video image are very similar to the image features of the next frame of that sub-picture video image. Therefore, unlike Embodiment 2 of the present invention, in which the current sub-picture video image is equalized by using the control parameter and its own adjustment coefficient, in this embodiment the control parameter and the weighted adjustment coefficient of the m frames of sub-picture video images preceding the current sub-picture video image Fn may be used to equalize the effect of the received current sub-picture video image Fn, and the image features of the current sub-picture video image Fn are acquired at the same time.
  • The weighted adjustment coefficient is obtained by weighting the adjustment coefficients of each of the m (m being a natural number greater than or equal to 1) frames of sub-picture video images preceding Fn, and the image features of each of these frames are required as the basis of the calculation.
  • The step of obtaining the weighted adjustment coefficient of the m frames of sub-picture video images preceding the current sub-picture video image Fn includes: S4011: Acquire the image features of each frame of sub-picture video image in the m frames of sub-picture video images preceding the current sub-picture video image Fn.
  • The acquisition of the image features of each frame of sub-picture video image in the m frames preceding the current sub-picture video image Fn is similar to the acquisition of the image features of the current sub-picture video image in the foregoing embodiment. For example, from the point of view of simplicity and ease of implementation, the histogram statistical method can also be used to count, in each frame of sub-picture video image in the m frames preceding the current sub-picture video image Fn, the pixel points whose luminance values lie in the interval [S_lum, T_lum], where S_lum is greater than or equal to 0, T_lum is less than or equal to Lm, and Lm is defined as in the foregoing embodiment, namely the maximum value used when describing the luminance standard. For example, when the luminance value is represented in 8-bit decimal form, Lm takes 255; when the luminance value is represented in 16-bit decimal form, Lm takes 65535; the present invention does not limit this.
  • A pixel point-luminance value chart of each frame of sub-picture video image in the m frames preceding the current sub-picture video image Fn can be constructed from the pixel points whose luminance values lie in the interval [S_lum, T_lum] and the luminance values corresponding to those pixel points, and the brightness characteristic of each of these frames can be determined from its pixel point-luminance value chart.
  • For example, if the luminance values of a large proportion of the pixel points in the pixel point-luminance value chart (for example, more than 80% of the pixel points) are smaller than the luminance value of a frame with normal brightness (for example, a luminance value of 100), it can be determined that the brightness characteristic of that frame of sub-picture video image in the m frames preceding the current sub-picture video image Fn is "image dark", and its brightness needs to be appropriately increased by some means; conversely, if the luminance values of a large proportion of the pixel points (for example, more than 80% of the pixel points) are greater than the luminance value of a frame with normal brightness (for example, a luminance value of 100), it can be determined that the brightness characteristic of that frame of sub-picture video image is "image bright", and its brightness needs to be appropriately reduced by some means.
  • Similarly, the histogram statistical method can also be used to count the pixel points whose chroma (Chroma) values lie within a certain range in each frame of sub-picture video image in the m frames preceding the current sub-picture video image Fn, in order to determine the chrominance characteristic of each of these frames, for example by counting the pixel points whose chroma red (CR) values or chroma blue (CB) values lie within a certain range in each such frame. Since white is the basic color and its luminance value is large (for example, usually reaching 200 when the value is represented in 8-bit decimal form), the statistics here differ from the statistics of the image luminance values: in the embodiment of the present invention, for the statistics of the CR value or the CB value, the pixel points whose chromaticity values are close to the chromaticity value of the white area are counted in each frame of sub-picture video image in the m frames preceding the current sub-picture video image Fn, a pixel point-chrominance value chart is constructed from these pixel points and their corresponding chrominance values, and the chrominance characteristic of each of these frames is determined from the pixel point-chrominance value chart.
  • In general, the CR value or the CB value of the white area is close to 128 (the value when the chromaticity value is represented in 8-bit decimal form). In view of this, in the embodiment of the present invention, the pixel points whose chromaticity values lie in the interval [128-T2, 128+T2] can be counted; these pixel points are the pixel points close to the chromaticity value of the white area. Therefore, if, in the pixel point-chrominance value chart constructed from the pixel points whose chromaticity values are close to the chromaticity value of the white area in the m frames of sub-picture video images preceding the current sub-picture video image Fn, the chrominance values (CR values or CB values) of some pixel points are smaller than 128 or greater than 128, it can be determined that the chrominance characteristic of the corresponding frame of sub-picture video image has a chromaticity offset, such as bluish, greenish or reddish, and the chromaticity of that frame of sub-picture video image needs to be appropriately adjusted by some means.
  • S4012: Calculate the weighted adjustment coefficient of the m frames of sub-picture video images preceding the current sub-picture video image Fn according to the control parameter and the image features of each frame of sub-picture video image in the m frames preceding Fn.
  • The weighted adjustment coefficient of the m frames of sub-picture video images preceding the current sub-picture video image Fn is obtained by weighting the adjustment coefficients of each of those m frames. Specifically, a first brightness weighting adjustment coefficient C'L1 and a second brightness weighting adjustment coefficient C'L2 of the m frames of sub-picture video images preceding the current sub-picture video image Fn can be calculated according to the control parameter and the image features of each of those frames, where the control parameter can be the reference value L'o used when each frame of sub-picture video image in the m frames preceding the current sub-picture video image Fn is equalized to the target brightness value; it is used to ensure that the current sub-picture video image Fn is neither too bright nor too dark after being corrected.
  • The calculation of the first brightness weighting adjustment coefficient C'L1 and the second brightness weighting adjustment coefficient C'L2 of the m frames of sub-picture video images includes the following steps: for the j-th frame (j = 1, 2, ..., m) of sub-picture video image, the pixel points whose pixel values lie in the interval [S_pix, T_pix] in its pixel point-luminance value chart are counted starting from the pixel point whose pixel value is S_pix; when the Q-th pixel point is counted, for example the Nj/2-th pixel point (when Nj/2 is not an integer, the pixel point closest to it can be taken), the pixel value P'0_j of that pixel point is obtained, where Nj is the number of pixel points in the pixel point-luminance value chart of the j-th frame of sub-picture video image, that is, the number of pixel points sampled into that chart. The pixel values P'0_1, ..., P'0_m obtained for the m frames are then weighted and combined to obtain P'0, and the first brightness weighting adjustment coefficient C'L1 and the second brightness weighting adjustment coefficient C'L2 are calculated from S_pix, T_pix, P'0 and L'o, for example as C'L1 = L'o/(P'0 - S_pix) and C'L2 = L'o/(T_pix - P'0); both C'L1 and C'L2 are linear with L'o.
  • For example, if the reference value L'o used when equalizing each frame of sub-picture video image in the m frames preceding the current sub-picture video image Fn to the target brightness value is 128, the pixel point-luminance value chart of the j-th frame of sub-picture video image contains Nj' pixel points, the pixel values lie in the interval [0, 255] (255 being the maximum pixel value when the pixel value is represented in 8-bit decimal form), and counting starts for the j-th frame from the pixel point whose pixel value is 0, then the pixel value of the Nj'/2-th pixel point, which lies in the interval [0, 255], is obtained for each frame; after weighting these values to obtain P'0, the first brightness weighting adjustment coefficient C'L1 is 128/P'0 and the second brightness weighting adjustment coefficient C'L2 is 128/(255 - P'0).
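  • A sketch of this weighted coefficient calculation (one explicit assumption: the per-frame values P'0_1, ..., P'0_m are combined here by a normalized weighted average with equal weights by default, since the text above only states that they are weighted):

```python
import numpy as np

def weighted_luma_coefficients(prev_frames, lo: int = 128,
                               s_pix: int = 0, t_pix: int = 255, weights=None):
    """Return (C'L1, C'L2) from the m luma planes preceding the current frame Fn."""
    medians = [float(np.median(f[(f >= s_pix) & (f <= t_pix)])) for f in prev_frames]
    w = np.full(len(medians), 1.0 / len(medians)) if weights is None else np.asarray(weights, dtype=float)
    p0_w = float(np.dot(w, medians))                    # weighted P'0
    p0_w = min(max(p0_w, s_pix + 1.0), t_pix - 1.0)
    cl1_w = lo / (p0_w - s_pix)                         # first brightness weighting adjustment coefficient C'L1
    cl2_w = lo / (t_pix - p0_w)                         # second brightness weighting adjustment coefficient C'L2
    return cl1_w, cl2_w
```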
  • Similarly, a first chrominance weighting adjustment coefficient C'C1 and a second chrominance weighting adjustment coefficient C'C2 can also be calculated according to the image features of each frame of sub-picture video image in the m frames preceding the current sub-picture video image Fn and one predetermined or set control parameter, where the control parameter can be the reference value C'o used when the current sub-picture video image is equalized to the target chromaticity value; it is used to ensure that the current sub-picture video image Fn has no chromaticity offset after being corrected. The calculation of the first chrominance weighting adjustment coefficient C'C1 and the second chrominance weighting adjustment coefficient C'C2 of the m frames of sub-picture video images includes the following steps:
  • For the k-th frame (k = 1, 2, ..., m) of sub-picture video image, the pixel points whose pixel values lie in the interval [U_pix, V_pix] in its pixel point-chrominance value chart are counted starting from the pixel point whose pixel value is U_pix, where U_pix is greater than or equal to 0, V_pix is less than or equal to Y, and Y is the maximum value used when describing the pixel value standard. The counting can proceed up to half of the pixel points in the pixel point-chrominance value chart of the k-th frame of sub-picture video image, that is, up to the Nk/2-th pixel point (when Nk/2 is not an integer, the pixel point closest to it can be taken), and the pixel value P'1_k of the Nk/2-th pixel point is obtained, where Nk is the number of pixel points in the pixel point-chrominance value chart of the k-th frame of sub-picture video image, that is, the number of pixel points sampled into that chart.
  • The pixel values P'1_1, ..., P'1_m obtained for the m frames are then weighted and combined to obtain P'1, and the first chrominance weighting adjustment coefficient C'C1 can be calculated from U_pix, V_pix, P'1 and C'o as C'o/(P'1 - U_pix), and the second chrominance weighting adjustment coefficient C'C2 as C'o/(V_pix - P'1); likewise, both C'C1 and C'C2 are linear with C'o.
  • Different from the foregoing embodiment, in which the effect of the current sub-picture video image is equalized by using the adjustment coefficient of the current sub-picture video image Fn itself, in this embodiment the effect of the current sub-picture video image Fn is equalized by using the control parameter and the weighted adjustment coefficient of the m frames of sub-picture video images preceding Fn. For example, equalizing the brightness of the current sub-picture video image Fn includes the following steps:
  • The pixel points whose pixel values lie in the interval [M_pix, N_pix] in the pixel point-luminance value chart of the current sub-picture video image Fn are counted starting from the pixel point whose pixel value is M_pix; the counting can proceed up to half of the pixel points in the chart, that is, up to the N/2-th pixel point (when N/2 is not an integer, the pixel point closest to N/2 can be taken), and the pixel value P2 of the N/2-th pixel point is obtained, where N is the number of pixel points in the pixel point-luminance value chart of the current sub-picture video image, that is, the number of pixel points sampled into the chart.
  • The luminance value of a pixel point in the current sub-picture video image whose pixel value lies in the interval [M_pix, P2] can then be corrected to C'L1 × (P - M_pix) and output, and the luminance value of a pixel point in the current sub-picture video image whose pixel value lies in the interval [P2, N_pix] can be corrected to N_lum - (C'L2 × (N_pix - P)) and then output, where P is the pixel value of the pixel point in the current sub-picture video image before equalization.
  • Similarly, for the chrominance, the pixel points whose pixel values lie in the interval [X_pix, Y_pix] in the pixel point-chrominance value chart of the current sub-picture video image Fn are counted starting from the pixel point whose pixel value is X_pix; when the T-th pixel point is counted, for example the N/2-th pixel point (when N/2 is not an integer, the pixel point closest to it can be taken), the pixel value P3 of that pixel point is obtained, where X_pix is greater than or equal to 0, Y_pix is less than or equal to Y, Y is the maximum value used when describing the pixel value standard, and N is the number of pixel points in the pixel point-chrominance value chart of the current sub-picture video image, that is, the number of pixel points sampled into the chart.
  • The chromaticity value of a pixel point in the current sub-picture video image whose pixel value lies in the interval [X_pix, P3] can then be corrected to C'C1 × (P - X_pix) and output, and the chromaticity value of a pixel point in the current sub-picture video image whose pixel value lies in the interval [P3, Y_pix] can be corrected to Y_chr - (C'C2 × (Y_pix - P)) and then output, where P is the pixel value of the pixel point before equalization and Y_chr is the chromaticity value of the pixel point whose pixel value is Y_pix, numerically equal to Y_pix.
  • The acquisition of the image features of the current sub-picture video image Fn is similar to the acquisition of the image features of the current sub-picture video image in the foregoing embodiment. For example, acquiring the brightness characteristic of the current sub-picture video image Fn includes: counting the pixel points of Fn whose luminance values lie in a given interval, constructing a pixel point-luminance value chart from those pixel points and their corresponding luminance values, and determining the brightness characteristic of the current sub-picture video image Fn from the pixel point-luminance value chart. Acquiring the chrominance characteristic of the current sub-picture video image Fn includes: counting the pixel points in the current sub-picture video image Fn whose chromaticity values are close to the chromaticity value of the white area, constructing a pixel point-chrominance value chart from those pixel points and their corresponding chrominance values, and determining the chrominance characteristic of the current sub-picture video image Fn from the pixel point-chrominance value chart.
  • The adjustment coefficient of the current sub-picture video image Fn, calculated before Fn is equalized, is then weighted together with the adjustment coefficients of the sub-picture video images preceding Fn to obtain the weighted adjustment coefficient used for the frame following the current sub-picture video image Fn.
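  • Putting the pieces of this embodiment together, the frame-by-frame flow can be sketched as follows (names and the equal-weight combination of the history are assumptions for illustration; the split point for the current frame is its own P2, while the coefficients come from the m previous frames, as described above):

```python
import collections
import numpy as np

def run_temporal_equalization(frames, m: int = 4, lo: int = 128):
    """Equalize each incoming luma plane Fn with coefficients weighted over the previous m frames."""
    history = collections.deque(maxlen=m)               # per-frame P'0_j values of the last m frames
    out = []
    for fn in frames:                                    # fn: 8-bit luma plane of one frame
        p2 = int(np.median(fn))                          # feature of Fn, acquired in parallel
        if history:
            p0_w = float(np.mean(history))               # weighted P'0 (equal weights assumed)
            p0_w = min(max(p0_w, 1.0), 254.0)
            cl1_w = lo / p0_w                            # C'L1 with S_pix = 0
            cl2_w = lo / (255.0 - p0_w)                  # C'L2 with T_pix = 255
            p = fn.astype(np.float32)
            corrected = np.where(p <= p2, cl1_w * p, 255.0 - cl2_w * (255.0 - p))
            out.append(np.clip(corrected, 0, 255).astype(np.uint8))
        else:
            out.append(fn)                               # first frame: no history to weight yet
        history.append(p2)                               # Fn's own value feeds the next frame's weighting
    return out
```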
  • From the method for equalizing the effect of multiple channels of sub-picture video images by using control parameters provided in Embodiment 4 of the present invention, it can be seen that, because the present invention calculates the adjustment coefficient for each independent sub-picture video image separately according to the same control parameter, the sub-picture video images can, after equalization with the adjustment coefficient and the same control parameter, be uniformly adjusted to have the same image features, so that they exhibit the same picture characteristics when displayed; the display of the multi-picture video image composed of the sub-picture video images thus achieves an overall harmony of style and enhances the participants' visual experience.
  • FIG. 5 is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 5 of the present invention.
  • the functional modules included in the device can be software modules, hardware modules or a combination of software and hardware.
  • the equalization module 51 is configured to receive a data code stream of the multi-channel sub-picture video image, and separately balance the effect of the multi-channel sub-picture video image by using the control parameter according to the image feature of the sub-picture video image;
  • the compositing module 52 is configured to synthesize the sub-picture video image equalized by the equalization module 51 into a multi-picture video image.
  • The equalization module 51 may further include a first image feature acquisition sub-module 61, a first calculation sub-module 62, and a first equalization sub-module 63, as shown in FIG. 6, which is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 6 of the present invention, wherein:
  • a first image feature acquisition sub-module 61, configured to acquire the image features of the current sub-picture video image;
  • a first calculation sub-module 62, configured to calculate an adjustment coefficient according to the control parameter and the image features of the current sub-picture video image acquired by the first image feature acquisition sub-module 61;
  • a first equalization sub-module 63, configured to equalize the effect of the current sub-picture video image by using the control parameter and the adjustment coefficient calculated by the first calculation sub-module 62.
  • The first image feature acquisition sub-module 61 may further include a first statistics unit 71, a first chart construction unit 72, and a luminance feature determination unit 73, as shown in FIG. 7, which is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 7 of the present invention, wherein:
  • the first statistics unit 71 is configured to count the pixel points in the current sub-picture video image whose luminance values lie in the interval [A_lum, B_lum], where A_lum is greater than or equal to 0, B_lum is less than or equal to Lm, and Lm is the maximum value used when describing the luminance standard;
  • the first chart construction unit 72 is configured to construct a pixel point-luminance value chart from the pixel points whose luminance values lie in the interval [A_lum, B_lum] and the luminance values corresponding to those pixel points;
  • the brightness feature determining unit 73 is configured to determine a brightness characteristic of the current sub-picture video image from the pixel point-luminance value chart.
  • The first calculation sub-module 62 may further include a first counting unit 81 and a brightness adjustment coefficient calculation unit 82, as shown in FIG. 8, which is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 8 of the present invention, wherein:
  • the first counting unit 81 is configured to count, in the pixel point-luminance value chart, the pixel points whose pixel values lie in the interval [A_pix, B_pix], starting from the pixel point whose pixel value is A_pix; when the K-th pixel point is counted, the pixel value P0 of the K-th pixel point is obtained, where A_pix and B_pix are numerically equal to A_lum and B_lum, respectively;
  • the brightness adjustment coefficient calculation unit 82 is configured to calculate the first brightness adjustment coefficient CL1 and the second brightness adjustment coefficient CL2 from A_pix, B_pix, P0 and Lo, where both the first brightness adjustment coefficient CL1 and the second brightness adjustment coefficient CL2 are linear with Lo.
  • The first equalization sub-module 63 may further include a first luminance equalization sub-unit 91 and a second luminance equalization sub-unit 92, as shown in FIG. 9, which is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 9 of the present invention, wherein:
  • the first luminance equalization sub-unit 91 linearly corrects, by using the reference value Lo and the first brightness adjustment coefficient CL1, the luminance values of the pixel points whose pixel values lie in the interval [A_pix, P0], so as to obtain the luminance value L1 of the current sub-picture video image on the [A_pix, P0] interval;
  • the second luminance equalization sub-unit 92 linearly corrects, by using the reference value Lo and the second brightness adjustment coefficient CL2, the luminance values of the pixel points whose pixel values lie in the interval [P0, B_pix], so as to obtain the luminance value L2 of the current sub-picture video image on the [P0, B_pix] interval.
  • The first image feature acquisition sub-module 61 may further include a second statistics unit 101, a second chart construction unit 102, and a chrominance feature determination unit 103, as shown in FIG. 10, which is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 10 of the present invention, wherein:
  • a second statistics unit 101, configured to count the pixel points in the current sub-picture video image whose chromaticity values are close to the chromaticity value of the white area;
  • a second chart construction unit 102, configured to construct a pixel point-chrominance value chart from the pixel points close to the chromaticity value of the white area and their corresponding chromaticity values;
  • the chrominance feature determining unit 103 is configured to determine a chrominance feature of the current sub-picture video image from the pixel point-chrominance value chart.
  • The first calculation sub-module 62 may further include a second counting unit 111 and a chromaticity adjustment coefficient calculation unit 112, as shown in FIG. 11, which is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 11 of the present invention, wherein:
  • the second counting unit 111 is configured to count, in the pixel point-chrominance value chart, the pixel points whose pixel values lie in the interval [J_pix, K_pix], starting from the pixel point whose pixel value is J_pix; when the J-th pixel point is counted, the pixel value P1 of the J-th pixel point is obtained, where J_pix is greater than or equal to 0, K_pix is less than or equal to Y, and Y is the maximum value used when describing the pixel value standard;
  • the chromaticity adjustment coefficient calculation unit 112 is configured to calculate the first chromaticity adjustment coefficient CC1 and the second chromaticity adjustment coefficient CC2 from J_pix, K_pix, P1 and Co, where both the first chromaticity adjustment coefficient CC1 and the second chromaticity adjustment coefficient CC2 are linear with Co.
  • The first equalization sub-module 63 may further include a first chrominance equalization sub-unit 121 and a second chrominance equalization sub-unit 122, as shown in FIG. 12, which is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 12 of the present invention, wherein:
  • the first chrominance equalization sub-unit 121 linearly corrects, by using the reference value Co and the first chromaticity adjustment coefficient CC1, the chromaticity values of the pixel points whose pixel values lie in the interval [J_pix, P1], so as to obtain the chrominance value C1 of the current sub-picture video image on the [J_pix, P1] interval;
  • the second chrominance equalization sub-unit 122 linearly corrects, by using the reference value Co and the second chromaticity adjustment coefficient CC2, the chromaticity values of the pixel points whose pixel values lie in the interval [P1, K_pix], so as to obtain the chrominance value C2 of the current sub-picture video image on the [P1, K_pix] interval.
  • The equalization module 51 may further include a second image feature acquisition sub-module 131 and a second calculation sub-module 132, as shown in FIG. 13, which is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 13 of the present invention, wherein:
  • the second image feature acquisition sub-module 131 is configured to, when the current sub-picture video image Fn is received, equalize the effect of the received current sub-picture video image Fn by using the control parameter and the weighted adjustment coefficient of the m frames of sub-picture video images preceding Fn, and at the same time acquire the image features of the current sub-picture video image Fn;
  • the second calculation sub-module 132 is configured to calculate the adjustment coefficient of the current sub-picture video image Fn, before Fn is equalized, according to the image features acquired by the second image feature acquisition sub-module 131, where m is a natural number greater than or equal to 1.
  • The second image feature acquisition sub-module 131 may further include a pre-m-frame image feature acquisition sub-module 141 and a pre-m-frame weighting adjustment coefficient calculation sub-module 142, as shown in FIG. 14, which is a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 14 of the present invention, wherein:
  • the pre-m-frame image feature acquisition sub-module 141 is configured to acquire the image features of each frame of sub-picture video image in the m frames of sub-picture video images preceding the current sub-picture video image Fn;
  • the pre-m-frame weighting adjustment coefficient calculation sub-module 142 is configured to calculate the weighted adjustment coefficient of the m frames of sub-picture video images preceding Fn according to the control parameter and the image features of each frame of sub-picture video image in the m frames preceding the current sub-picture video image Fn.
  • the first m frame image feature acquisition sub-module 141 may further include a first m frame first statistic unit 151, a pre-m frame first statistic construction unit 152, and a pre-m frame luminance feature determination unit 153. As shown in FIG. 15, the present invention is implemented.
  • a schematic diagram of a structure of a multi-picture video image processing apparatus provided in Example 15, wherein:
  • the pre-m-frame first statistics unit 151 is configured to count the pixel points whose luminance values lie in the interval [Slum, Tlum] in each frame of sub-picture video image among the m frames of sub-picture video images before the current sub-picture video image Fn, where Slum is greater than or equal to 0, Tlum is less than or equal to Lm, and Lm is the maximum value used when describing the luminance standard;
  • the pre-m-frame first statistical chart construction unit 152 is configured to construct, from the pixel points whose luminance values lie in the interval [Slum, Tlum] and the luminance values corresponding to those pixel points, a pixel point-luminance value statistical chart for each frame of sub-picture video image among the m frames of sub-picture video images before the current sub-picture video image Fn;
  • the pre-m-frame luminance feature determination unit 153 is configured to determine, from the pixel point-luminance value statistical charts, the luminance feature of each frame of sub-picture video image among the m frames of sub-picture video images before the current sub-picture video image Fn.
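The "pixel point-luminance value statistical chart" reads like a luminance histogram restricted to [Slum, Tlum]. The sketch below builds such a chart and reduces it to a single luminance feature; taking the mean in-range luminance as that feature is an assumption, since the passage does not define the feature.

```python
import numpy as np

def luminance_histogram(luma, s_lum, t_lum, n_bins=None):
    """Build a pixel point-luminance value chart (here: a histogram) over [Slum, Tlum]."""
    luma = np.asarray(luma).ravel()
    in_range = luma[(luma >= s_lum) & (luma <= t_lum)]
    if n_bins is None:
        n_bins = int(t_lum - s_lum) + 1              # one bin per integer level
    counts, edges = np.histogram(in_range, bins=n_bins, range=(s_lum, t_lum + 1))
    return counts, edges

def luminance_feature(counts, edges):
    """Reduce the chart to a single luminance feature (mean level, an assumption)."""
    centers = (edges[:-1] + edges[1:]) / 2.0
    total = counts.sum()
    return float((counts * centers).sum() / total) if total else 0.0
```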
  • the pre-m-frame weighting adjustment coefficient calculation sub-module 142 may further include a pre-m-frame first counting sub-unit 161, a pre-m-frame first summation sub-unit 162, and a pre-m-frame luminance weighting adjustment coefficient calculation unit 163. As shown in FIG. 16, a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 16 of the present invention is provided, wherein:
  • the pre-m-frame first counting sub-unit 161 is configured to: in the pixel point-luminance value statistical chart of each frame of sub-picture video image among the m frames before the current sub-picture video image, count the pixel points whose pixel values lie in the interval [Spix, Tpix], starting from the pixel point whose pixel value is Spix; for the j-th frame of sub-picture video image among the m frames of sub-picture video images before the current sub-picture video image, when the Q-th pixel point is counted, the pixel value of the Q-th pixel point is obtained, where j is 1, 2, ..., m, and Spix and Tpix are equal in value to Slum and Tlum, respectively;
  • the pre-m-frame luminance weighting adjustment coefficient calculation unit 163 is configured to calculate the first luminance weighting adjustment coefficient C'L1 and the second luminance weighting adjustment coefficient C'L2 according to Spix, Tpix, P'0 and L'0, wherein the first luminance weighting adjustment coefficient C'L1 and the second luminance weighting adjustment coefficient C'L2 are both linear in L'0.
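The passage states only that both coefficients are calculated from Spix, Tpix, P'0 and L'0 and are linear in L'0. The sketch below encodes just that linearity; the slopes and offsets a1, b1, a2, b2, which in the device would follow from Spix, Tpix and P'0 together with the control parameter, are hypothetical placeholders.

```python
def luminance_weighting_coefficients(l_prime_0, a1, b1, a2, b2):
    """Return (C'L1, C'L2), each an affine (linear) function of L'0.

    a1, b1, a2, b2 are hypothetical placeholders for factors derived from
    Spix, Tpix and P'0; only the linearity in L'0 is stated in the passage.
    """
    c_l1 = a1 * l_prime_0 + b1
    c_l2 = a2 * l_prime_0 + b2
    return c_l1, c_l2

# Hypothetical usage with made-up numbers:
c_l1, c_l2 = luminance_weighting_coefficients(l_prime_0=121.4,
                                              a1=0.004, b1=0.5,
                                              a2=-0.003, b2=1.4)
```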
  • the second image feature acquisition sub-module 131 may include a current sub-picture video image first statistics unit 171, a current sub-picture video image first statistical chart construction unit 172, a current sub-picture video image first counting unit 173, a pre-m-frame first luminance equalization sub-unit 174, and a pre-m-frame second luminance equalization sub-unit 175. As shown in FIG. 17, a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 17 of the present invention is provided, wherein:
  • the current sub-picture video image first statistics unit 171 is configured to count the pixel points of the current sub-picture video image Fn whose luminance values lie in the interval [Mlum, Nlum], where Mlum is greater than or equal to 0, Nlum is less than or equal to Lm, and Lm is the maximum value used when describing the luminance standard;
  • the current sub-picture video image first statistical chart construction unit 172 is configured to construct a pixel point-luminance value statistical chart of the current sub-picture video image Fn according to the pixel points whose luminance values lie in the interval [Mlum, Nlum] and the luminance values corresponding to those pixel points;
  • the current sub-picture video image first counting unit 173 is configured to: in the pixel point-luminance value statistical chart of the current sub-picture video image Fn, count the pixel points whose pixel values lie in the interval [Mpix, Npix], starting from the pixel point whose pixel value is Mpix; when the P-th pixel point is counted, the pixel value P2 of the P-th pixel point is obtained, wherein Mpix and Npix are equal in value to Mlum and Nlum, respectively;
  • the pre-m-frame first luminance equalization sub-unit 174 linearly corrects, using the reference value Lo and the first luminance weighting adjustment coefficient C'L1, the luminance values of the pixel points whose pixel values lie in the interval [Mpix, P2], to obtain the luminance value L'1 of the current sub-picture video image on the interval [Mpix, P2];
  • the pre-m-frame second luminance equalization sub-unit 175 linearly corrects, using the reference value Lo and the second luminance weighting adjustment coefficient C'L2, the luminance values of the pixel points whose pixel values lie in the interval [P2, Npix], to obtain the luminance value L'2 of the current sub-picture video image on the interval [P2, Npix].
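The counting step of unit 173 amounts to walking the chart upward from Mlum until P pixels have been seen and reading off that level as the split value P2. A sketch is below; how the count P is chosen (for example, a fixed fraction of the in-range pixel count) is not stated here and is left as an input. The subsequent two-segment correction with Lo, C'L1 and C'L2 then proceeds exactly as in the chrominance sketch earlier, with P2 as the split point.

```python
import numpy as np

def find_split_value_p2(luma, m_lum, n_lum, p_count):
    """Locate P2: the level of the P-th pixel counted upward from Mlum
    in the pixel point-luminance value chart of the current frame Fn.
    Assumes integer luminance levels."""
    y = np.asarray(luma).ravel()
    levels = np.arange(m_lum, n_lum + 1)
    counts = np.array([(y == v).sum() for v in levels])  # chart over [Mlum, Nlum]
    cum = np.cumsum(counts)
    idx = int(np.searchsorted(cum, p_count))             # first level reaching P pixels
    return int(levels[min(idx, len(levels) - 1)])
```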
  • the pre-m-frame image feature acquisition sub-module 141 may include a pre-m-frame second statistics unit 181, a pre-m-frame second statistical chart construction unit 182, and a pre-m-frame chrominance feature determination unit 183. As shown in FIG. 18, a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 18 of the present invention is provided, wherein:
  • the pre-m-frame second statistics unit 181 is configured to count, in each frame of sub-picture video image among the m frames of sub-picture video images before the current sub-picture video image Fn, the pixel points whose chrominance values are close to the chromaticity value of the white area;
  • the pre-m-frame second statistical chart construction unit 182 is configured to construct, according to the pixel points close to the white area chromaticity value and their corresponding chrominance values, a pixel point-chrominance value statistical chart for each frame of sub-picture video image among the m frames of sub-picture video images before the current sub-picture video image Fn;
  • the pre-m-frame chrominance feature determination unit 183 is configured to determine, from the pixel point-chrominance value statistical chart of each frame of sub-picture video image among the m frames of sub-picture video images before the current sub-picture video image Fn, the chrominance feature of each of those frames.
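The passage does not define "close to the white area chromaticity value". The sketch below uses a Euclidean distance threshold around a nominal white point in the (Cb, Cr) plane and takes the mean chrominance of the selected pixels as the per-frame chrominance feature; the criterion, the threshold and the feature definition are all assumptions.

```python
import numpy as np

def near_white_chroma_feature(cb, cr, white_cb=128.0, white_cr=128.0, radius=12.0):
    """Select pixel points whose chrominance is close to the white area and
    reduce them to a single chrominance feature.

    cb, cr : chrominance planes of one sub-picture video image
    radius : how close to the white point (white_cb, white_cr) a pixel must be;
             the Euclidean criterion and the numeric values are assumptions.
    """
    cb = np.asarray(cb, dtype=np.float64).ravel()
    cr = np.asarray(cr, dtype=np.float64).ravel()
    near = np.hypot(cb - white_cb, cr - white_cr) <= radius
    if not near.any():
        return white_cb, white_cr            # fall back to the nominal white point
    # Feature taken here as the mean chrominance of the near-white pixels.
    return float(cb[near].mean()), float(cr[near].mean())
```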
  • the pre-m-frame weighting adjustment coefficient calculation sub-module 142 may further include a pre-m-frame second counting sub-unit 191, a pre-m-frame second summation sub-unit 192, and a pre-m-frame chrominance weighting adjustment coefficient calculation unit 193. As shown in FIG. 19, a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 19 of the present invention is provided, wherein:
  • the pre-m-frame second counting sub-unit 191 is configured to: in the pixel point-chrominance value statistical chart of each frame of sub-picture video image among the m frames before the current sub-picture video image, count the pixel points whose pixel values lie in the interval [Upix, Vpix], starting from the pixel point whose pixel value is Upix; for the k-th frame of sub-picture video image among the previous m frames of sub-picture video images, when the W-th pixel point is counted, the pixel value of the W-th pixel point is obtained;
  • the pre-m-frame chrominance weighting adjustment coefficient calculation unit 193 is configured to calculate the first chrominance weighting adjustment coefficient C'C1 and the second chrominance weighting adjustment coefficient C'C2 according to Upix, Vpix, P'1 and C'0, wherein the first chrominance weighting adjustment coefficient C'C1 and the second chrominance weighting adjustment coefficient C'C2 are both in a linear relationship with C'0.
  • the second image feature acquisition sub-module 131 may include a current sub-picture video image second statistics unit 201, a current sub-picture video image second statistical chart construction unit 202, a current sub-picture video image second counting unit 203, a pre-m-frame first chrominance equalization sub-unit 204, and a pre-m-frame second chrominance equalization sub-unit 205. As shown in FIG. 20, a schematic structural diagram of a multi-picture video image processing apparatus according to Embodiment 20 of the present invention is provided, wherein:
  • the current sub-picture video image second statistics unit 201 is configured to count the pixel points in the current sub-picture video image Fn whose chrominance values are close to the chromaticity value of the white area;
  • the current sub-picture video image second statistical chart construction unit 202 is configured to construct a pixel point-chrominance value statistical chart of the current sub-picture video image Fn according to the pixel points close to the white area chromaticity value and their corresponding chrominance values;
  • the current sub-picture video image second counting unit 203 is configured to: in the pixel point-chrominance value statistical chart of the current sub-picture video image Fn, count the pixel points whose pixel values lie in the interval [Xpix, Ypix], starting from the pixel point whose pixel value is Xpix; when the T-th pixel point is counted, the pixel value P3 of the T-th pixel point is obtained, where Xpix is greater than or equal to 0 and Ypix is less than or equal to the maximum value used when describing the pixel value standard;
  • the pre-m-frame first chrominance equalization sub-unit 204 linearly corrects, using the reference value C'0 and the first chrominance weighting adjustment coefficient C'C1, the chrominance values of the pixel points whose pixel values lie in the interval [Xpix, P3], to obtain the chrominance value C'1 of the current sub-picture video image on the interval [Xpix, P3];
  • the pre-m-frame second chrominance equalization sub-unit 205 linearly corrects, using the reference value C'0 and the second chrominance weighting adjustment coefficient C'C2, the chrominance values of the pixel points whose pixel values lie in the interval [P3, Ypix], to obtain the chrominance value C'2 of the current sub-picture video image on the interval [P3, Ypix].
  • Application scenario 1: During a video conference, multiple terminal sites, for example sites A, B, C and D, join the conference at the same time. Each terminal site compresses the video image of its own site with a video compression protocol and transmits the compressed video code stream to the multipoint control unit (MCU, multipoint controlling unit) over the network.
  • After receiving the compressed video code stream of each site terminal, the decoding module of the MCU decodes each code stream with the corresponding video compression protocol to obtain the sub-picture video images (relative to the reconstructed multi-picture video image) required for reconstructing the multi-picture video image.
  • The sub-picture video images are input to the multi-picture video image processing apparatus. After processing by the multi-picture video image processing device, the multipoint control unit combines the sub-picture video images output by the multi-picture video image processing device to synthesize a multi-picture video image, in which each sub-picture video image corresponds to one site picture.
  • The encoding module then re-encodes the multi-picture video image and sends the encoded code stream to the receiving end; the receiving end completes decoding and outputs to the display device, thereby implementing a multi-picture conference process.
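As a rough, hypothetical illustration of the composition step only (decoding and re-encoding of the code streams are outside its scope), the sketch below tiles the already-equalized sub-picture video images of sites A, B, C and D into one multi-picture frame, each sub-picture corresponding to one site picture; the 2x2 layout and equal sub-picture sizes are assumptions.

```python
import numpy as np

def compose_multi_picture(sub_pictures, grid=(2, 2)):
    """Tile equalized sub-picture video images into one multi-picture frame.

    sub_pictures : list of equal-sized HxWx3 arrays, one per conference site
    grid         : layout of the composed picture (2x2 for sites A, B, C, D)
    """
    rows, cols = grid
    h, w = sub_pictures[0].shape[:2]
    canvas = np.zeros((rows * h, cols * w, 3), dtype=sub_pictures[0].dtype)
    for idx, pic in enumerate(sub_pictures[:rows * cols]):
        r, c = divmod(idx, cols)
        canvas[r * h:(r + 1) * h, c * w:(c + 1) * w] = pic
    return canvas
```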
  • Application scenario 2: During a video conference, multiple terminal sites, for example sites A, B, C and D, join the conference. Each terminal site compresses the video image of its own site with a video compression protocol and transmits the compressed video code stream to the multipoint control unit over the network.
  • After receiving the video compression code stream of each site terminal, the multipoint control unit decodes each compressed code stream with the corresponding video compression protocol to obtain the sub-picture video images (relative to the reconstructed multi-picture video image) required for reconstructing the multi-picture video image.
  • The sub-picture video images are input into the multi-picture video image processing device. After processing by the multi-picture video image processing device, the encoding module encodes each sub-picture video image with a video compression protocol and sends the encoded code streams to different receiving ends; each receiving end completes decoding and outputs to its display device, thereby realizing a multi-point conference process.
  • Application scenario 3: During a video conference, multiple terminal sites, for example sites A, B, C and D, join the conference at the same time. Each terminal site compresses the video image of its own site with a video compression protocol and transmits the compressed video code stream to the multipoint control unit over the network.
  • After receiving the video compression code stream of each site terminal, the MCU decodes each video compression code stream with the corresponding video compression protocol according to the format of the conference, obtaining the sub-picture video images (relative to the reconstructed multi-picture video image) required for reconstructing the multi-picture video image; the encoding module then re-encodes each sub-picture video image with the video compression protocol and sends the encoded code streams to the receiving end, where the receiving end comprises a multi-picture video image processing device.
  • The sub-picture video images are decoded by the receiving end, processed by the multi-picture video image processing device, and then output to different display devices, achieving a multi-point conference process.
  • Alternatively, the code streams may be forwarded to the receiving end without being decoded by the decoding module and re-encoded by the encoding module; after receiving the multiple code streams, the receiving end decodes them, processes them with the multi-picture video image processing device, and outputs them to different display devices, thereby achieving a multi-point conference process.
  • All or part of the steps of the foregoing embodiments may be implemented by a program instructing relevant hardware; the program can be stored in a computer readable storage medium.
  • The storage medium can include: a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Image Processing (AREA)

Abstract

Embodiments of the present invention relate to a method and a device for processing a multi-picture video image, which solve the problem in prior-art solutions that, because the characteristics of each sub-picture cannot be satisfied simultaneously, the viewing experience of the participant is degraded. The method includes: receiving the code streams of the multi-channel sub-picture video images; equalizing the effect of the multi-channel sub-picture video images with control parameters according to the image features of the respective sub-picture video images; and synthesizing the equalized multi-channel sub-picture video images into a multi-picture video image. The present invention can uniformly adjust each sub-picture video image to an effect presenting the same image characteristics, so that the sub-picture video images show the same image characteristics when displayed; the display of the multi-picture video image composed of the sub-picture video images thus achieves complete harmony of style and improves the viewing experience of the participant.
PCT/CN2011/070688 2010-01-29 2011-01-27 Procédé et dispositif pour traiter une reproduction vidéo à images multiples WO2011091755A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP11736637.7A EP2521351B1 (fr) 2010-01-29 2011-01-27 Procédé et dispositif pour traiter une reproduction vidéo à images multiples
US13/562,003 US8947498B2 (en) 2010-01-29 2012-07-30 Method and device for processing multi-picture video image

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201010104939.4A CN101778246B (zh) 2010-01-29 2010-01-29 多画面视频图像处理方法和装置
CN201010104939.4 2010-01-29

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/562,003 Continuation US8947498B2 (en) 2010-01-29 2012-07-30 Method and device for processing multi-picture video image

Publications (1)

Publication Number Publication Date
WO2011091755A1 true WO2011091755A1 (fr) 2011-08-04

Family

ID=42514548

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2011/070688 WO2011091755A1 (fr) 2010-01-29 2011-01-27 Procédé et dispositif pour traiter une reproduction vidéo à images multiples

Country Status (4)

Country Link
US (1) US8947498B2 (fr)
EP (1) EP2521351B1 (fr)
CN (1) CN101778246B (fr)
WO (1) WO2011091755A1 (fr)

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101778246B (zh) 2010-01-29 2014-04-02 华为终端有限公司 多画面视频图像处理方法和装置
CN103096012B (zh) * 2011-11-08 2016-08-03 华为技术有限公司 调整图像显示的方法、设备及系统
CN102752534B (zh) * 2011-11-15 2017-11-10 新奥特(北京)视频技术有限公司 一种调色系统中对比静帧的方法
US8866876B2 (en) 2011-12-07 2014-10-21 Futurewei Technologies, Inc. Color correction for multiple video objects in telepresence applications
CN104572150B (zh) * 2013-10-21 2021-04-13 联想(北京)有限公司 一种信息处理方法和装置
US10368097B2 (en) * 2014-01-07 2019-07-30 Nokia Technologies Oy Apparatus, a method and a computer program product for coding and decoding chroma components of texture pictures for sample prediction of depth pictures
CN105100545A (zh) * 2014-05-23 2015-11-25 三亚中兴软件有限责任公司 画面亮度调整方法、多点控制单元mcu及终端
TWI530927B (zh) * 2014-06-06 2016-04-21 瑞軒科技股份有限公司 顯示亮度的調校方法、畫面顯示方法以及顯示裝置
JP6354578B2 (ja) * 2014-12-26 2018-07-11 株式会社Jvcケンウッド 撮像システム
US10397585B2 (en) 2015-06-08 2019-08-27 Qualcomm Incorporated Processing high dynamic range and wide color gamut video data for video coding
CN104954692B (zh) * 2015-06-30 2019-05-07 百度在线网络技术(北京)有限公司 通过终端设备控制相机拍摄的方法及装置
CN105096799A (zh) * 2015-08-07 2015-11-25 深圳市康冠商用科技有限公司 将多路画面中的各路画面进行独立调节的显示方法及系统
CN105392011B (zh) * 2015-11-05 2020-12-01 厦门雅迅网络股份有限公司 一种平衡不同视频解码器色彩的方法
CN106941598A (zh) * 2016-01-04 2017-07-11 中兴通讯股份有限公司 多画面码流合成方法、多画面码流合成控制方法及装置
CN107682587B (zh) * 2017-09-22 2019-08-06 北京嗨动视觉科技有限公司 视频处理器
CN108063932B (zh) * 2017-11-10 2020-10-27 广州极飞科技有限公司 一种光度标定的方法及装置
US10623791B2 (en) 2018-06-01 2020-04-14 At&T Intellectual Property I, L.P. Field of view prediction in live panoramic video streaming
US10812774B2 (en) 2018-06-06 2020-10-20 At&T Intellectual Property I, L.P. Methods and devices for adapting the rate of video content streaming
US10616621B2 (en) 2018-06-29 2020-04-07 At&T Intellectual Property I, L.P. Methods and devices for determining multipath routing for panoramic video content
US11019361B2 (en) 2018-08-13 2021-05-25 At&T Intellectual Property I, L.P. Methods, systems and devices for adjusting panoramic view of a camera for capturing video content
US10708494B2 (en) 2018-08-13 2020-07-07 At&T Intellectual Property I, L.P. Methods, systems and devices for adjusting panoramic video content
CN111402825B (zh) * 2020-03-31 2022-08-19 浙江宇视科技有限公司 一种屏幕校正方法、装置、系统和逻辑板
CN111770333B (zh) * 2020-07-17 2022-05-24 广州市奥威亚电子科技有限公司 一种图像合并方法和系统
CN113556504A (zh) * 2021-07-19 2021-10-26 西安万像电子科技有限公司 视频会议的显示图像的处理方法及装置、视频会议系统
CN115766973A (zh) * 2021-09-02 2023-03-07 北京字跳网络技术有限公司 一种视频拼接方法、装置、设备及介质
CN116506562B (zh) * 2023-06-27 2023-09-05 深圳市门钥匙科技有限公司 一种基于多通道的视频显示方法及系统

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005065051A (ja) * 2003-08-18 2005-03-10 Sony Corp 撮像装置
CN1852414A (zh) * 2005-11-28 2006-10-25 华为技术有限公司 一种视频码流伽玛特性校正方法及多点控制单元
CN101179696A (zh) * 2007-12-04 2008-05-14 中兴通讯股份有限公司 远程调节会议电视终端图像参数的方法、系统及设备
CN101778246A (zh) * 2010-01-29 2010-07-14 华为终端有限公司 多画面视频图像处理方法和装置

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR0124387B1 (ko) * 1993-12-07 1997-12-01 김광호 픽쳐인픽쳐기능을 구비한 영상디스플레이기기의 화면상태안정화 방법 및 그 장치
FR2837056B1 (fr) * 2002-03-07 2004-09-17 France Telecom Procede et systeme d'uniformisation du rendu colorimetrique d'une juxtaposition de surfaces d'affichage
JP3954550B2 (ja) * 2003-09-03 2007-08-08 オリンパス株式会社 画像表示プログラム、画像表示装置、画像表示方法
CN101052125A (zh) 2006-04-05 2007-10-10 深圳Tcl新技术有限公司 自适应多画面分割装置
US20080049034A1 (en) * 2006-07-12 2008-02-28 Daniel Chin Uniform image display for multiple display devices
CN100589583C (zh) * 2007-06-05 2010-02-10 广东威创视讯科技股份有限公司 一种多屏幕拼墙校正方法
US20090167782A1 (en) * 2008-01-02 2009-07-02 Panavision International, L.P. Correction of color differences in multi-screen displays
CN101635137B (zh) * 2008-07-23 2012-02-08 深圳市巨烽显示科技有限公司 图像无缝显示方法及装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005065051A (ja) * 2003-08-18 2005-03-10 Sony Corp 撮像装置
CN1852414A (zh) * 2005-11-28 2006-10-25 华为技术有限公司 一种视频码流伽玛特性校正方法及多点控制单元
CN101179696A (zh) * 2007-12-04 2008-05-14 中兴通讯股份有限公司 远程调节会议电视终端图像参数的方法、系统及设备
CN101778246A (zh) * 2010-01-29 2010-07-14 华为终端有限公司 多画面视频图像处理方法和装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2521351A4 *

Also Published As

Publication number Publication date
US20120293603A1 (en) 2012-11-22
CN101778246A (zh) 2010-07-14
EP2521351A4 (fr) 2013-01-02
US8947498B2 (en) 2015-02-03
EP2521351B1 (fr) 2016-01-27
EP2521351A1 (fr) 2012-11-07
CN101778246B (zh) 2014-04-02

Similar Documents

Publication Publication Date Title
WO2011091755A1 (fr) Procédé et dispositif pour traiter une reproduction vidéo à images multiples
US7113200B2 (en) Method and system for preparing video communication image for wide screen display
US8885014B2 (en) Appearance matching for videoconferencing
US7646736B2 (en) Video conferencing system
US10264193B2 (en) System and method for providing images and video having high dynamic range
US8289371B2 (en) Smart cropping of video images in a videoconferencing session
US9049340B2 (en) Method and apparatus for adjusting site bandwidth, conferencing terminal, and media control server
US7720157B2 (en) Arrangement and method for generating CP images
US20060259552A1 (en) Live video icons for signal selection in a videoconferencing system
US8947490B2 (en) Method and apparatus for video processing for improved video compression
WO2005069619A1 (fr) Systeme de visioconference
US20060171336A1 (en) Video multi-conference unit (MCU)
US8836753B2 (en) Method, apparatus, and system for processing cascade conference sites in cascade conference
CN109963156B (zh) 用于控制图像编码的方法及控制器、电子设备
US10674163B2 (en) Color space compression
JP2008005349A (ja) 映像符号化装置、映像伝送装置、映像符号化方法及び映像伝送方法
CN112399126A (zh) 视频处理方法、装置、终端设备和存储介质
WO2021254452A1 (fr) Procédé de commande d'un système de visioconférence, ainsi qu'unité de commande multipoint et support de stockage
WO2006064250A1 (fr) Ecrans d'affichage sans scintillement a bande passante reduite
JPH07131769A (ja) 画像通信端末装置
JP2011234067A (ja) Tv電話システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11736637

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2011736637

Country of ref document: EP