WO2017113850A1 - Method and apparatus for obtaining parallax parameters of a stereoscopic film source - Google Patents

Method and apparatus for obtaining parallax parameters of a stereoscopic film source

Info

Publication number
WO2017113850A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
parallax
parameter
stereoscopic
eye image
Prior art date
Application number
PCT/CN2016/097696
Other languages
English (en)
Chinese (zh)
Inventor
楚明磊
Original Assignee
乐视控股(北京)有限公司
乐视致新电子科技(天津)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 乐视控股(北京)有限公司 and 乐视致新电子科技(天津)有限公司
Publication of WO2017113850A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/167 Synchronising or controlling image signals
    • H04N 13/128 Adjusting depth or disparity
    • H04N 13/194 Transmission of image signals

Definitions

  • The present application relates to the field of image processing, for example, to a method and apparatus for acquiring parallax parameters of a stereoscopic film source.
  • With the development of three-dimensional (3D) technology, the number of stereoscopic film sources is increasing, and more and more devices support viewing them.
  • The principle of 3D technology is to capture the same object simultaneously with two shooting devices placed on the same horizontal line but separated by a certain horizontal distance, obtaining a left image and a right image of the object.
  • When the stereoscopic film source is played, the user's left eye views the left image and the right eye views the right image. Since the shooting devices are horizontally separated to simulate the interpupillary distance of the human eyes, there is a certain parallax between the objects presented in the left and right images.
  • the left and right images are simultaneously transmitted to the brain through the retina, and the brain uses the parallax to generate the depth of field effect of the object, thereby obtaining a stereoscopic effect.
  • In the related art, the stereoscopic film source itself does not provide parallax parameters, and manual calculation is required to obtain them.
  • A relevant person finds the same object in the left and right images, for example the object "water cup", and then manually counts the number of pixels separating the two "cups" in the horizontal direction to obtain their horizontal distance on the X axis, thereby obtaining the parallax parameter between the two "cups".
  • the stereoscopic degree of different objects in the 3D image is different, so the horizontal distances of different objects in the left and right images are different from each other.
  • When the parallax parameters are acquired, the professional needs to select and measure different objects separately, obtaining multiple parallax parameters, and then select their maximum and minimum values to determine the parallax variation range of the stereoscopic film source.
  • Because the related technology mainly relies on the subjective recognition and judgment of the relevant personnel to select objects, and the horizontal distance is also calculated manually by a professional, the efficiency of acquiring parallax parameters is very low when the number of objects in the 3D image is large.
  • The related method can still cope with static stereoscopic pictures, but for stereoscopic video composed of hundreds of thousands or even millions of frames, manually obtaining the parallax parameters of every frame would be prohibitively time-consuming.
  • The present application provides a method and apparatus for obtaining parallax parameters of a stereoscopic film source, which solves the problem of low efficiency in obtaining the parallax parameters of a stereoscopic film source.
  • An embodiment of the present application provides a method for obtaining parallax parameters of a stereoscopic film source, including:
  • reading an image frame from a stereoscopic film source file, the image frame including a left eye image and a right eye image;
  • determining a plurality of first image regions in one of the left eye image and the right eye image;
  • searching, in the other of the left eye image and the right eye image, for a plurality of second image regions corresponding in content to the plurality of first image regions, the mutually corresponding first image regions and second image regions having the same size;
  • separately calculating a horizontal distance between each first image region and the second image region corresponding to it, to obtain a plurality of parallax parameters; and
  • determining a maximum parallax parameter and a minimum parallax parameter to obtain a parallax variation range of the stereoscopic film source.
  • An embodiment of the present application further provides an apparatus for acquiring parallax parameters of a stereoscopic film source, including:
  • a reading unit configured to read an image frame from the stereoscopic film source file, the image frame including a left eye image and a right eye image;
  • a region determining unit configured to determine a plurality of first image regions in one of the left eye image and the right eye image;
  • a searching unit configured to search, in the other of the left eye image and the right eye image, for a plurality of second image regions corresponding in content to the plurality of first image regions, the mutually corresponding first image regions and second image regions having the same size;
  • a calculating unit configured to separately calculate a horizontal distance between each first image region and the second image region corresponding to it, to obtain a plurality of parallax parameters; and
  • a parameter determining unit configured to determine a maximum parallax parameter and a minimum parallax parameter to obtain a parallax variation range of the stereoscopic film source.
  • An embodiment of the present application further provides an electronic device, including:
  • at least one processor; and a memory;
  • the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to:
  • read an image frame from the stereoscopic film source file, the image frame including a left eye image and a right eye image; determine a plurality of first image regions in one of the left eye image and the right eye image; search, in the other of the left eye image and the right eye image, for a plurality of second image regions corresponding in content to the plurality of first image regions, the mutually corresponding first image regions and second image regions having the same size; separately calculate a horizontal distance between each first image region and the second image region corresponding to it, to obtain a plurality of parallax parameters; and determine a maximum parallax parameter and a minimum parallax parameter to obtain a parallax variation range of the stereoscopic film source.
  • An embodiment of the present application further provides a non-transitory computer readable storage medium storing computer instructions, where the computer instructions are used to cause a computer to execute the above method of obtaining stereoscopic film source parallax parameters.
  • An embodiment of the present application further provides a computer program product, including a computer program stored on a non-transitory computer readable storage medium, the computer program including program instructions that, when executed by a computer, cause the computer to perform the above method of acquiring stereoscopic film source parallax parameters.
  • With the method and apparatus for acquiring stereoscopic film source parallax parameters provided by the present application, an image frame can be read automatically from the stereoscopic film source file; a plurality of first image regions and second image regions that are identical in size and correspond one to one are determined in the left and right eye images of the image frame; the horizontal distance of each pair of first and second image regions is calculated to obtain the parallax parameter for that pair; and a maximum parallax parameter and a minimum parallax parameter are then determined from the plurality of parallax parameters, thereby obtaining the parallax variation range of the stereoscopic film source.
  • As a result, the time required to obtain the parallax parameters is greatly shortened and the processing efficiency is improved.
  • FIG. 1 is a flowchart of a method for obtaining parallax parameters of a stereoscopic film source according to an embodiment of the present application;
  • FIG. 2 is a flowchart of another method for obtaining stereoscopic film source parallax parameters according to an embodiment of the present application;
  • FIG. 3 is a flowchart of still another method for obtaining parallax parameters of a stereoscopic film source according to an embodiment of the present application;
  • FIGS. 4a to 4d are schematic diagrams of determining a first image region according to an embodiment of the present application;
  • FIG. 4e is a schematic diagram of determining a second image region according to an embodiment of the present application;
  • FIGS. 5a to 5d are schematic diagrams of determining pixel points according to an embodiment of the present application;
  • FIG. 6 is a structural block diagram of an apparatus for acquiring parallax parameters of a stereoscopic film source according to an embodiment of the present application;
  • FIG. 7 is a structural block diagram of another apparatus for acquiring stereoscopic film source parallax parameters according to an embodiment of the present application;
  • FIG. 8 is a schematic diagram of the physical structure of an electronic device for acquiring stereoscopic film source parallax parameters according to an embodiment of the present application.
  • An embodiment of the present application provides a method for obtaining parallax parameters of a stereoscopic film source. As shown in FIG. 1, the method includes:
  • In step 110, an image frame is read from the stereoscopic film source file.
  • the stereoscopic source in this embodiment may be a stereoscopic image or a stereoscopic video.
  • For a stereoscopic picture, the source file usually includes only one image frame, while for a stereoscopic video the source file generally includes many image frames.
  • the embodiment refers to an image in a stereoscopic picture as an image frame.
  • The stereoscopic film source is captured by two imaging devices on the same horizontal line and separated by a certain distance, so whether it is a stereoscopic picture or a stereoscopic video, each image frame consists of a left eye image and a right eye image. Since the two imaging devices are horizontally separated to simulate the interpupillary distance of the human eyes, the left eye image and the right eye image have a certain positional offset in the horizontal direction, and this positional offset is the parallax discussed in this embodiment.
  • The computer can read the stereoscopic film source file from a storage area or a database according to a default storage path, directly receive the stereoscopic source file sent by the network side through a preset application interface, or receive, through a preset operation interface, the stereoscopic source file uploaded by the user. This embodiment does not specifically limit the manner in which the stereoscopic source file is obtained.
  • In step 120, a plurality of first image regions are determined in one of the left eye image and the right eye image.
  • a prerequisite for obtaining the parallax parameter is to determine two image regions corresponding to the same content, that is, the first image region and the second image region, respectively, in the left and right images.
  • the first image region is first determined therein based on any one of the left and right images, and then the second image region is determined correspondingly in the other image.
  • the present embodiment will be described by taking the first image region in the left eye image as an example. Of course, in the actual application, the first image region may be first determined based on the right eye image.
  • the manner of obtaining the parallax parameter is the same as that of the former method, and the present embodiment does not repeat the description.
  • Ideally, the parallax of all the content in the left and right images would be consistent, that is, the parallax between pedestrian 1 and pedestrian 2 (note: pedestrian 1 is the pedestrian in the left eye image and pedestrian 2 is the corresponding pedestrian in the right eye image; the same convention is used below) and the parallax between car 1 and car 2 should be the same. However, because the depth of field of different objects differs, the parallax of different content may differ in practice. For this reason, a plurality of first image regions are determined in the left eye image, a plurality of second image regions are correspondingly determined in the right eye image afterwards, and a plurality of parallax parameters are calculated from them.
  • the objects in the left eye image can be identified by image recognition technology, and then different first image regions including different objects are determined according to the position and size of the different objects.
  • the parallax parameters between the objects can be accurately calculated from the naturally formed objects, but because of the use of image recognition technology, a large amount of calculation is involved in practical applications.
  • Alternatively, in this embodiment the first image regions may be determined in a rough (fuzzy) manner, that is, the first image regions are determined directly, without object recognition as a prerequisite.
  • a plurality of first image regions are determined by randomly selecting positions in the left eye image, or several first image regions are determined at specific positions according to the size ratio of the left eye image. This method does not require recognition of the content of the image, and is faster to implement.
  • In step 130, a plurality of second image regions corresponding in content to the plurality of first image regions are searched for in the other of the left eye image and the right eye image.
  • Content correspondence means that the content displayed in a second image region is the same as the content displayed in the corresponding first image region. From the perspective of controlling variables, only the difference in position of the same content between the left and right images can reflect the parallax of that content, so this embodiment requires the content of corresponding first and second image regions to match. Under this premise, the first image regions and second image regions should be identical at least in the following respects: 1. the content is the same; 2. the size of the regions is the same; 3. the number of regions is the same.
  • The computer can find, in the right eye image, the content corresponding to the content of each first image region through an image matching technique (for example, methods provided in OpenCV), and determine the position of the second image region according to the found content. In other words, when the second image regions are determined in the right eye image, the content of the right eye image needs to be identified by the image matching technique so that the determined second image regions correspond in content to the first image regions in the left eye image; as a precondition for calculating the parallax parameters, this content identification is necessary. By contrast, the image recognition process in step 120 may be omitted to improve the processing speed.
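  • For illustration only (not part of the claimed method), the following Python sketch shows one way such an image matching step could be implemented with OpenCV template matching: the first image region is cut out of the left eye image and located in the right eye image, and the horizontal offset of the best match is taken as a parallax estimate. The function name and the use of normalized cross-correlation are assumptions made for this example.

    import cv2

    def find_second_region(left_img, right_img, region):
        """Locate, in right_img, the region matching a first image region of left_img.

        region is (x, y, w, h) in pixel coordinates. Returns the top-left corner of
        the best match in right_img and the signed horizontal offset in pixels.
        Illustrative sketch only.
        """
        x, y, w, h = region
        template = left_img[y:y + h, x:x + w]            # first image region (left eye)
        scores = cv2.matchTemplate(right_img, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, best = cv2.minMaxLoc(scores)            # location of the highest score
        match_x, match_y = best
        parallax = match_x - x                           # coordinate difference on the X axis
        return (match_x, match_y), parallax

  • In practice the search could be restricted to a horizontal band around the first image region, since the offset between the left and right images is essentially horizontal.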
  • In step 140, the horizontal distance between each first image region and the second image region corresponding to it is calculated separately to obtain a plurality of parallax parameters.
  • After equal numbers of first image regions and second image regions have been obtained, the horizontal distance between each first image region and the second image region corresponding to it in content is calculated separately, yielding a parallax parameter for each pair of first/second image regions.
  • For example, if four first image regions are determined in step 120, four corresponding second image regions are determined in step 130. Calculating the parallax parameter a between first image region 1 and second image region 1, the parallax parameter b between first image region 2 and second image region 2, the parallax parameter c between first image region 3 and second image region 3, and the parallax parameter d between first image region 4 and second image region 4 yields four parallax parameters, equal in number to the first (or second) image regions.
  • the horizontal distance may be calculated based on a certain pixel point in the first image area and the second image area, or the horizontal distance may be calculated based on the same area edge of the first image area and the second image area.
  • the horizontal distance may also be calculated based on the entirety of the first image area and the second image area. This embodiment does not specifically limit the object for calculating the horizontal distance reference.
  • Here, the horizontal distance refers to the difference in position between the first image region and the second image region on the X axis. The unit may be a number of pixels, or a unit of length in the ordinary sense, such as millimetres or centimetres; this embodiment does not limit the unit. In general, however, the distance difference may be positive or negative.
  • In step 150, the maximum parallax parameter and the minimum parallax parameter are determined to obtain the parallax variation range of the stereoscopic film source.
  • the disparity between different content may be different.
  • Obtaining the parallax variation range of the entire image frame is more practical than reporting the parallax parameter of every region in the image frame. Therefore, in this step, the computer finds the maximum parallax parameter and the minimum parallax parameter among all the obtained parallax parameters, and uses them as the upper and lower boundary values of the parallax variation range of the image frame.
  • For a stereoscopic picture, the parallax variation range obtained in this step is the parallax variation range of the stereoscopic film source. For a stereoscopic video, which contains many image frames, the parallax parameters of all the image frames can be pooled together, and the maximum parallax parameter and minimum parallax parameter among them determine the parallax variation range of the stereoscopic video.
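  • As a minimal sketch of steps 120 to 150, assuming a hypothetical helper frame_parallax_params() that implements steps 120 to 140 for one image frame, the per-frame parameters of a stereoscopic video could be pooled as follows:

    def parallax_range(frames, frame_parallax_params):
        """Pool the parallax parameters of all image frames and return (min, max).

        frames yields (left_img, right_img) pairs; frame_parallax_params is a
        hypothetical helper returning the parallax parameters of one frame.
        """
        pooled = []
        for left_img, right_img in frames:
            pooled.extend(frame_parallax_params(left_img, right_img))
        return min(pooled), max(pooled)                  # parallax variation range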
  • An embodiment of the present application further provides a method for acquiring parallax parameters of a stereoscopic film source that includes a plurality of image frames.
  • A stereoscopic source containing a plurality of image frames is usually a stereoscopic video, but some stereoscopic pictures are also composed of at least two image frames, and the method applies to them equally.
  • As shown in FIG. 2, the method includes:
  • In step 210, at least two image frames are extracted from the stereoscopic source file as sample frames.
  • the computer calculates a parallax parameter for at least two image frames in the stereoscopic source file, and puts the parallax parameters of all the image frames together, and selects a maximum parallax parameter and a minimum parallax parameter.
  • The computer extracts image frames from the stereoscopic source file according to a preset rule, which defines how many image frames are extracted and which ones. Two implementations of extracting sample frames are given below.
  • Mode 1: extraction according to a sampling interval.
  • the image frames have a certain order in the stereo source files. This order is generally determined by the order in which the image frames are played.
  • Based on this ordering, at least two sample frames may be extracted according to a preset sampling interval.
  • The sampling interval may be an interval frame number or an interval time.
  • For the former, it may be specified that, starting from the first image frame, one image frame is extracted every 4 image frames, or every 100 image frames; for the latter, extraction may follow the playback timing of the image frames, starting from the moment the first image frame is played, with one image frame extracted every 100 milliseconds, or every 15000 milliseconds.
  • the specific numerical values in this mode are for illustrative purposes only and are not intended as a limitation of the actual application.
  • Mode 2: extraction by a random number generation algorithm.
  • The computer can extract the sample frames using a random number generation algorithm. For example, the computer uses a random number generating function to generate a random number no greater than the total number of image frames, and extracts the image frame whose frame number equals that random number as a sample frame.
  • Alternatively, the computer may generate a floating-point random number greater than 0 and not greater than 1, and multiply it by the total number of image frames to obtain the frame number of a randomly extracted image frame; in this case the algorithm needs to be adjusted (for example by rounding) to ensure that the product of the random number and the total number of image frames is a positive integer.
  • The computer can also receive an extraction rule set by the staff through a preset human-computer interaction mode; the extraction rule may only limit the number of image frames to extract, or may also specify exactly which image frames are extracted. This embodiment does not limit this. In an extreme case of this embodiment, all image frames in the stereoscopic source file can be extracted.
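  • The two extraction modes can be sketched in Python as follows; the interval of 100 frames and the sample count of 10 are placeholder values, not values prescribed by this application:

    import random

    def sample_by_interval(total_frames, step=100):
        """Mode 1: extract every step-th image frame, starting from the first frame."""
        return list(range(0, total_frames, step))

    def sample_randomly(total_frames, count=10):
        """Mode 2: extract frame numbers chosen by a random number generator."""
        return sorted(random.sample(range(total_frames), k=min(count, total_frames)))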
  • In step 220, the parallax parameters of each sample frame are obtained separately.
  • the computer performs step 120 to step 140 in FIG. 1 for each sample frame to obtain the parallax parameters of each sample frame.
  • For example, suppose 4 parallax parameters are to be calculated for each sample frame, that is, 4 first/second image regions are respectively determined in the left and right images of each sample frame.
  • The number of parallax parameters calculated for each sample frame may be the same (for example, four, as in the example above), or different numbers may be determined for different sample frames, for example, three parallax parameters for some sample frames and seven for others.
  • This embodiment limits neither the number of parallax parameters acquired per sample frame nor requires this number to be the same for every sample frame.
  • The computer may obtain the parallax parameters of the sample frames in ascending order of frame number, or in the order in which the sample frames were extracted. In practice, when there are enough idle computing resources, the computer can obtain the parallax parameters of some or all of the sample frames simultaneously through a preset number of parallel threads.
  • In step 230, the maximum parallax parameter and the minimum parallax parameter among the parallax parameters of all the sample frames are determined to obtain the parallax variation range of the stereoscopic film source.
  • the maximum disparity parameter and the minimum disparity parameter are selected from all disparity parameters of all sample frames.
  • For example, if a total of 520 parallax parameters are obtained from all the sample frames, the computer selects from these 520 parallax parameters the one with the largest value and the one with the smallest value.
  • An embodiment of the present application further provides a method for obtaining parallax parameters of a stereoscopic film source.
  • The method is described from the perspective of obtaining parallax parameters from a single image frame; although this perspective is somewhat limited, it does not prevent the method from being applied in the multi-frame scenario of FIG. 2. As shown in FIG. 3, the method includes:
  • In step 310, an image frame is read from the stereoscopic film source file.
  • In step 320, a plurality of first image regions of preset size are determined at preset positions in one of the left eye image and the right eye image.
  • In this embodiment, the computer determines, at preset positions in the left eye image, a plurality of first image regions of preset size.
  • Variables that can be set automatically by the computer or set manually include: 1. the coordinates of the preset positions; 2. the number of preset positions (i.e., the number of first image regions); 3. the size of the first image regions.
  • For example, the computer randomly selects three rectangular regions in the left eye image by a random algorithm and determines them as first image regions (the length and width of the rectangles are not shown).
  • Alternatively, the computer selects four square regions according to the layout of a 2x2 grid and determines them as first image regions; the side length of each square is reduced by a preset ratio relative to the corresponding grid cell (the side length is not shown).
  • Alternatively, the computer selects nine square regions according to the layout of a 3x3 grid and determines them as first image regions; the side length of each square is reduced by a preset ratio relative to the corresponding grid cell (the side length is not shown).
  • Alternatively, the computer selects eight square regions along the diagonals of the left eye image and determines them as first image regions (the side length is not shown).
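  • As an illustrative sketch of such preset layouts (the region counts, the 3x3 grid and the shrink ratio are placeholders chosen for this example), candidate first image regions could be generated as follows:

    import random

    def grid_regions(img_w, img_h, rows=3, cols=3, shrink=0.5):
        """Place one region, shrunk by a preset ratio, at the centre of each grid cell."""
        cell_w, cell_h = img_w // cols, img_h // rows
        w, h = int(cell_w * shrink), int(cell_h * shrink)
        regions = []
        for r in range(rows):
            for c in range(cols):
                x = c * cell_w + (cell_w - w) // 2       # centre the region in its cell
                y = r * cell_h + (cell_h - h) // 2
                regions.append((x, y, w, h))
        return regions

    def random_regions(img_w, img_h, count=3, w=80, h=60):
        """Pick regions of a preset size at random positions inside the image."""
        return [(random.randint(0, img_w - w), random.randint(0, img_h - h), w, h)
                for _ in range(count)]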
  • In step 330, the first image regions are filtered, and those whose degree of dispersion of pixel chromaticity is less than a preset degree of dispersion are eliminated.
  • In this embodiment, the determined first image regions may be filtered to remove image regions with a single chromaticity.
  • In a single-chromaticity image region (for example, an all-white or all-green image region), the chromaticities of the pixels are the same or very similar, so the horizontal offset of such a region cannot be accurately determined, which makes the calculated parallax parameter larger or smaller than the actual value. Therefore, in this step, first image regions with a single chromaticity can be eliminated, thereby eliminating inaccurate parallax parameters.
  • the image region of a single chromaticity includes not only an image region (for example, an all white region) involving only one chromaticity, but also an image region involving a plurality of chromaticities but relatively close chromaticity.
  • For example, an image region containing blue sky and white clouds involves both white and blue hues, but under certain weather conditions the chromaticity of the white clouds is very similar to that of the blue sky behind them, with no obvious difference between the two; such an image region will also be eliminated in this step.
  • the computer can cull the first image region of a single chrominance by calculating the degree of dispersion of the pixel chromaticity.
  • the degree of dispersion of pixel chrominance reflects the degree of chromaticity difference between pixels in the image region. The greater the degree of dispersion, the greater the difference in pixel chromaticity.
  • The preset degree of dispersion is used as the boundary condition that distinguishes single-chromaticity image regions from image regions whose chromaticity dispersion satisfies the calculation requirement. In practice, the preset degree of dispersion can be derived by the computer from the accuracy of historical parallax parameters, or set by the staff based on experience.
  • The degree of dispersion of pixel chromaticity can be quantified by the standard deviation, which characterizes how far all values deviate from their mean; the deviation is accumulated from each value's individual difference from the mean, so the standard deviation reflects, as a whole, the degree of dispersion of all values relative to the mean.
  • In this embodiment, the computer calculates, over the pixels of the first image region, the standard deviation of each of the three color components red, green and blue, obtaining three values: a red standard deviation, a green standard deviation and a blue standard deviation.
  • Taking the red component as an example, the red standard deviation reflects how far the red chromaticities of the pixels deviate from their mean; the larger the red standard deviation, the greater the dispersion of the pixels on the red component and the less uniform the red chromaticity of the image region as a whole.
  • The computer sums the three values to obtain a comprehensive standard deviation, and then compares the comprehensive standard deviation with a preset standard deviation threshold. First image regions whose comprehensive standard deviation is less than the standard deviation threshold are eliminated.
  • The standard deviation is calculated as σ = sqrt( (1/N) · Σ_i (x_i - μ)² ), where σ is the standard deviation value, N is the number of pixel points in the first image region, x_i is the chromaticity value of the i-th pixel point, and μ is the chromaticity mean of all the pixel points.
  • For the red component, the computer calculates the standard deviation of all pixels as σ_r = sqrt( (1/N) · Σ_i (x_ri - μ_r)² ), where σ_r is the standard deviation of the first image region on the red component, x_ri is the chromaticity value of the i-th pixel on the red component, and μ_r is the mean red chromaticity of all pixels.
  • For the green component, the computer calculates σ_g = sqrt( (1/N) · Σ_i (x_gi - μ_g)² ), where σ_g is the standard deviation of the first image region on the green component, x_gi is the chromaticity value of the i-th pixel on the green component, and μ_g is the mean green chromaticity of all pixels.
  • For the blue component, the computer calculates σ_b = sqrt( (1/N) · Σ_i (x_bi - μ_b)² ), where σ_b is the standard deviation of the first image region on the blue component, x_bi is the chromaticity value of the i-th pixel on the blue component, and μ_b is the mean blue chromaticity of all pixels.
  • In this way the computer obtains the standard deviation values σ_r, σ_g and σ_b corresponding to the three color components of the first image region. Then σ_r, σ_g and σ_b are summed to obtain the comprehensive standard deviation σ_c. Finally, the computer compares σ_c with the preset standard deviation threshold. If σ_c is greater than the standard deviation threshold, the dispersion of the pixel chromaticity in the first image region is large enough and the first image region is retained; if σ_c is smaller than the standard deviation threshold, the degree of dispersion of the pixel chromaticity in the first image region is not large enough and the first image region is eliminated. This completes the screening of the first image regions. In the example shown in FIG. 4a, the first image region in the lower right corner will be rejected.
  • In another implementation, the degree of dispersion may instead be quantified by the average (mean absolute) difference, calculated as V = (1/N) · Σ_i |x_i - μ|, where V is the average difference of a certain color component and the symbols inside the absolute value correspond to those in the foregoing formulas.
  • the average difference algorithm can reduce the calculation amount of the computer to some extent, but the accuracy of the standard deviation algorithm is higher from the perspective of reflecting the degree of color deviation.
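  • A minimal sketch of this screening step, assuming the region is given as an H x W x 3 pixel array (for example a crop of an image loaded with OpenCV) and using an arbitrary example threshold:

    import numpy as np

    def keep_region(region_pixels: np.ndarray, std_threshold: float = 30.0) -> bool:
        """Return True if the region's chromaticity dispersion is large enough.

        region_pixels: H x W x 3 array of colour values. The per-channel standard
        deviations are summed into a comprehensive standard deviation and compared
        with the preset threshold; single-chromaticity regions are rejected.
        """
        per_channel_std = region_pixels.reshape(-1, 3).std(axis=0)   # one value per colour component
        comprehensive_std = float(per_channel_std.sum())
        return comprehensive_std > std_threshold

  • The average difference mentioned above could be used in the same way, for example by replacing the standard deviation with np.abs(values - values.mean(axis=0)).mean(axis=0) over the reshaped pixel values.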
  • Alternatively, the edges of objects in the image may be identified according to the degree of change in pixel brightness, and first image regions containing few objects may be removed accordingly.
  • step 330 is an optional step in FIG. 3, and in actual application, step 340 may be directly executed after step 320 is performed, that is, the first image area is not filtered.
  • In step 340, a plurality of second image regions corresponding in content to the plurality of first image regions are searched for in the other of the left eye image and the right eye image.
  • In this embodiment, the other image is the right eye image. The computer determines a plurality of second image regions in the right eye image; the number and size of the second image regions are the same as the number and size of the first image regions in the left eye image.
  • Due to the horizontal offset between the left and right images, the position of a second image region in the right eye image is generally different from, rather than the same as, the position of the corresponding first image region in the left eye image.
  • The computer needs to ensure that the content covered by each second image region is the same as the content covered by the corresponding first image region: the computer searches the right eye image for the content of the first image region through image matching technology, and then determines the second image region at the found content according to the size of the first image region.
  • FIG. 4e shows the second image area determined in the right eye image, and it can be seen that the two correspond to the same number, size and content as compared with the first image area in the left eye image. However, due to the horizontal spacing, the position of the second image area is slightly to the left relative to the first image area.
  • In step 350, the same pixel points are selected in the mutually corresponding first image region and second image region.
  • the computer selects the same pixel point from the two image areas respectively.
  • Here, "the same pixel" refers to pixels that have the same relative coordinates within their respective image regions; for example, the fifth pixel of the fourth row in the first image region and the fifth pixel of the fourth row in the second image region are the same pixel in this sense. Since the first image region and the second image region have the same size, using relative coordinates guarantees that the selected pixels correspond.
  • the computer determined pixel points may be as shown in Figure 5a.
  • it is also possible to select pixels at a particular location such as a pixel on the four corners of the image area, a central pixel in the image area, or a pixel on the edge of the image area. The latter three cases are shown in Figures 5b, 5c and 5d, respectively.
  • In step 360, the coordinate difference of the selected pixel points on the X axis is calculated to obtain a parallax parameter.
  • After determining the pixel points, the computer calculates the coordinate difference of the two pixel points on the X axis, thereby obtaining the parallax parameter corresponding to that pair of first/second image regions.
  • the computer sequentially performs steps 350 and 360 for each pair of first/second image regions to obtain a plurality of parallax parameters corresponding to the number of first/second image regions.
  • the computer can obtain the absolute coordinates of the two pixel points in the left and right images, and then subtract the X-axis component values of the two absolute coordinates to obtain the coordinate difference of the pixel points on the X-axis.
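  • Steps 350 and 360 can be sketched as follows, using the centre pixel of each region as the selected pixel (one of the choices mentioned above); regions are (x, y, w, h) tuples in absolute image coordinates:

    def region_parallax(first_region, second_region):
        """Signed X-axis coordinate difference between the centre pixels of two regions."""
        x1, _, w, _ = first_region
        x2, _, _, _ = second_region                      # same size as the first region
        centre_x_left = x1 + w // 2                      # absolute X of the centre pixel (left eye)
        centre_x_right = x2 + w // 2                     # absolute X of the centre pixel (right eye)
        return centre_x_right - centre_x_left            # parallax parameter in pixels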
  • In step 370, maximum and minimum parallax parameters that exceed a preset reasonable parallax range are eliminated.
  • A parallax parameter can be positive or negative; the sign reflects whether the second image region is shifted to the left or to the right relative to the first image region. Depending on the reference convention, leftward and rightward offsets visually represent protrusion and recession of the object respectively, or recession and protrusion respectively.
  • The absolute value of the parallax parameter reflects how strongly the object appears to protrude or recede.
  • In general, the stereoscopic depth of objects in a stereoscopic film source is limited: if the degree of protrusion or recession is too large, the viewer's eyes become uncomfortable. On this basis, when the maximum parallax parameter obtained in the preceding steps is too large, or the minimum parallax parameter is too small (too negative), the calculation result can be considered inaccurate, and such parallax parameters need to be removed.
  • In this embodiment, the computer finds all parallax parameters greater than 0 and calculates their average, the first parallax average, and takes a preset multiple of the first parallax average as the boundary condition for rejecting the maximum parallax parameter. If the maximum parallax parameter is less than the preset multiple of the first parallax average, the maximum parallax parameter is retained; otherwise it is eliminated. After rejecting a maximum parallax parameter that does not meet the requirement, the computer finds the largest of the remaining parallax parameters and applies the same test, until no parallax parameter larger than the boundary condition remains; the largest remaining parallax parameter is then determined as the final maximum parallax parameter.
  • Correspondingly, the computer finds all parallax parameters less than 0 and calculates their average, the second parallax average, and takes a preset multiple of the second parallax average as the boundary condition for rejecting the minimum parallax parameter. If the minimum parallax parameter is greater than the preset multiple of the second parallax average, the minimum parallax parameter is retained; otherwise it is eliminated. After rejecting a minimum parallax parameter that does not meet the requirement, the computer finds the smallest of the remaining parallax parameters and applies the same test, until no parallax parameter smaller than the boundary condition remains; the smallest remaining parallax parameter is then determined as the final minimum parallax parameter.
  • the above preset multiple can take “1" or "2".
  • An example of this step is given below:
  • the computer obtains six parallax parameters, "+3", “+6”, “+30”, “-1”, “-2”, and “-12”.
  • The parallax parameters "+3", "+6" and "+30" are averaged to obtain the first parallax average "+13", which is then multiplied by the preset multiple "2" to get "+26". Compared with "+26", the parallax parameter "+30" exceeds this value, so the computer rejects it.
  • The parallax parameter "+6" does not exceed this value, so the computer determines it as the maximum parallax parameter.
  • The computer then averages the parallax parameters "-1", "-2" and "-12" to obtain the second parallax average "-5", which is multiplied by the preset multiple "2" to get "-10". Compared with "-10", the parallax parameter "-12" exceeds this value, so the computer rejects it. The parallax parameter "-2" does not exceed this value, so the computer determines it as the minimum parallax parameter.
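  • The rejection rule used in this example can be sketched as follows; the preset multiple of 2 follows the example above, and the sketch assumes the boundary condition is computed once from the initial averages, as in the example:

    def parallax_bounds(params, multiple=2.0):
        """Return (min_parallax, max_parallax) after rejecting out-of-range extremes."""
        positives = sorted(p for p in params if p > 0)
        negatives = sorted((p for p in params if p < 0), reverse=True)

        max_p = None
        if positives:
            bound = multiple * sum(positives) / len(positives)   # e.g. 2 * (+13) = +26
            while positives and positives[-1] > bound:           # drop values above the bound
                positives.pop()
            max_p = positives[-1] if positives else None

        min_p = None
        if negatives:
            bound = multiple * sum(negatives) / len(negatives)   # e.g. 2 * (-5) = -10
            while negatives and negatives[-1] < bound:           # drop values below the bound
                negatives.pop()
            min_p = negatives[-1] if negatives else None

        return min_p, max_p

  • With the six parallax parameters of the example, parallax_bounds([+3, +6, +30, -1, -2, -12]) returns (-2, 6), i.e. the variation range [-2, +6].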
  • It should be noted that step 370 is also an optional step in the flow shown in FIG. 3.
  • Step 380 may be executed directly after step 360; when step 370 is performed, the parallax variation range obtained by the flow shown in FIG. 3 is more accurate.
  • In step 380, the parallax variation range of the stereoscopic film source is determined based on the obtained maximum parallax parameter and minimum parallax parameter.
  • Continuing the above example, the finally determined maximum parallax parameter is "+6" and the minimum parallax parameter is "-2", so the parallax variation range of the stereoscopic film source is [-2, +6].
  • An embodiment of the present application further provides an apparatus for acquiring parallax parameters of a stereoscopic film source.
  • The apparatus may be integrated, in software or hardware form, on the client side or on the server side, to calculate parallax parameters for stereoscopic film sources provided by local storage or an external database.
  • As shown in FIG. 6, the apparatus includes: a reading unit 610, an area determining unit 620, a searching unit 630, a calculating unit 640, and a parameter determining unit 650;
  • the reading unit 610 is configured to read an image frame from the stereoscopic source file, the image frame including a left eye image and a right eye image;
  • the area determining unit 620 is configured to determine a plurality of first image areas in one of the left eye image and the right eye image;
  • the searching unit 630 is configured to search, in the other of the left eye image and the right eye image, for a plurality of second image regions corresponding in content to the plurality of first image regions, the mutually corresponding first image regions and second image regions having the same size;
  • the calculating unit 640 is configured to separately calculate a horizontal distance between each of the first image regions and the second image region corresponding thereto, to obtain a plurality of parallax parameters;
  • the parameter determining unit 650 is configured to determine the maximum parallax parameter and the minimum parallax parameter to obtain a parallax variation range of the stereoscopic film source.
  • the device further includes:
  • the extracting unit 660 is configured to: when there are multiple image frames in the stereoscopic source file, extract at least two image frames from the stereoscopic source file as sampling frames;
  • the calculating unit 640 is configured to obtain a parallax parameter of each sampling frame separately;
  • the parameter determining unit 650 is configured to determine a maximum parallax parameter and a minimum parallax parameter among the parallax parameters of all the sample frames to obtain a parallax variation range of the stereoscopic film source.
  • the extracting unit 660 includes:
  • the interval extraction sub-unit 6610 is configured to extract at least two sampling frames according to a preset sampling interval based on the ordering of the image frames in the stereoscopic source file, and the sampling interval includes the interval frame number and/or the interval duration; or
  • the random extraction sub-unit 6620 is arranged to randomly extract at least two sample frames by a random number generation algorithm.
  • the area determining unit 620 is configured to determine a plurality of first image regions of preset size at preset positions in one of the left eye image and the right eye image, wherein the preset positions are determined randomly or calculated according to the size ratio of the image.
  • the device further includes:
  • the screening unit 670 is configured to, after the plurality of first image regions are determined, filter the first image regions and exclude those whose degree of dispersion of pixel chromaticity is less than the preset degree of dispersion.
  • the screening unit 670 includes:
  • the first calculating sub-unit 6710 is configured to calculate the standard deviation values of the three color components red, green and blue for the pixels in the first image region;
  • the second calculating subunit 6720 is configured to perform a summation calculation on the red standard deviation value, the green standard deviation value, and the blue standard deviation value to obtain a comprehensive standard deviation value;
  • the culling sub-unit 6730 is arranged to reject the first image region whose integrated standard deviation is less than the standard deviation threshold.
  • the calculating unit 640 is configured to: select the same pixel points in the mutually corresponding first image region and second image region, and calculate the coordinate difference of the pixel points on the X axis to obtain the corresponding parallax parameter.
  • the parameter determining unit 650 is configured to reject the maximum parallax parameter and the minimum parallax parameter that exceed a reasonable range of the preset parallax.
  • the parameter determining unit 650 includes:
  • the first determining subunit 6510 is configured to calculate the average of all parallax parameters greater than 0 to obtain a first parallax average, and to retain the maximum parallax parameter when the maximum parallax parameter is less than a preset multiple of the first parallax average, or otherwise eliminate the maximum parallax parameter;
  • the second determining subunit 6520 is configured to calculate the average of all parallax parameters less than 0 to obtain a second parallax average, and to retain the minimum parallax parameter when the minimum parallax parameter is greater than a preset multiple of the second parallax average, or otherwise eliminate the minimum parallax parameter.
  • The apparatus for acquiring stereoscopic film source parallax parameters provided by this embodiment can automatically read an image frame from the stereoscopic film source file, determine, in the left and right eye images of the image frame, a plurality of first image regions and second image regions that are identical in size and correspond one to one, calculate the horizontal distance of each pair of first and second image regions to obtain the parallax parameter for that pair, and then determine a maximum parallax parameter and a minimum parallax parameter from the plurality of parallax parameters, thereby obtaining the parallax variation range of the stereoscopic film source.
  • The processes of acquiring an image frame, determining image regions and calculating parallax parameters in this embodiment are all completed automatically by a computer; the relevant personnel only need to provide the computer with the storage path of the stereoscopic film source file. Since no manual operation is involved, this embodiment can greatly shorten the time required to obtain parallax parameters and improve the processing efficiency, especially when processing stereoscopic video.
  • the functions of the unit or the subunit used in the embodiment of the present application can be implemented by a hardware processor.
  • FIG. 8 is a schematic diagram of a physical structure of an electronic device for acquiring a stereoscopic source parallax parameter according to an embodiment of the present application.
  • The electronic device may include: a processor 810, a communication interface 820, a memory 830, and a bus 840, wherein the processor 810, the communication interface 820, and the memory 830 communicate with one another through the bus 840.
  • Communication interface 820 can be arranged to transmit and/or receive information over a communication network.
  • The processor 810 can call the logic instructions in the memory 830 to perform any of the above methods for acquiring stereoscopic film source parallax parameters, including: reading an image frame from the stereoscopic film source file, the image frame including a left eye image and a right eye image; determining a plurality of first image regions in the left eye image or the right eye image; searching, in the other image, for a plurality of second image regions corresponding in content to the plurality of first image regions, the mutually corresponding first image regions and second image regions having the same size; separately calculating a horizontal distance between each first image region and the second image region corresponding to it, to obtain a plurality of parallax parameters; and determining a maximum parallax parameter and a minimum parallax parameter to obtain the parallax variation range of the stereoscopic film source.
  • the logic instructions in the memory 830 described above may be implemented in the form of a software functional unit and sold or used as a stand-alone product, and may be stored in a computer readable storage medium.
  • The technical solution of the embodiments of the present application may be embodied in the form of a software product stored on a storage medium and including a plurality of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application.
  • The foregoing storage medium includes: a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.
  • An embodiment of the present application further provides a non-transitory computer readable storage medium storing computer instructions, the computer instructions being used to cause a computer to execute any one of the above methods of acquiring stereoscopic film source parallax parameters.
  • The device embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • The above embodiments can be implemented by means of software plus a general-purpose hardware platform, or, of course, by hardware.
  • The above technical solution may be embodied in the form of a software product, which may be stored in a computer readable storage medium such as a ROM/RAM, a magnetic disk, or an optical disk, and which includes a plurality of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in the embodiments or in parts of the embodiments.
  • various features in the embodiments of the present application may be arbitrarily combined with each other, and the technical solutions after the combination still fall within the protection scope of the present application.
  • The embodiments of the present application provide a method and an apparatus for acquiring parallax parameters of a stereoscopic film source, which solve the problem of low efficiency in acquiring the parallax parameters of a stereoscopic film source.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention relates to a method and apparatus for obtaining parallax parameters of a stereoscopic film source. The method comprises: reading an image frame from a stereoscopic film source file, the image frame comprising a left eye image and a right eye image; determining a plurality of first image regions in one of the left eye image and the right eye image; searching, in the other of the left eye image and the right eye image, for a plurality of second image regions corresponding in content to the plurality of first image regions, the mutually corresponding first image regions and second image regions having the same size; separately calculating a horizontal distance between each first image region and the second image region corresponding to it, to obtain a plurality of parallax parameters; and determining a maximum parallax parameter and a minimum parallax parameter, to obtain a parallax variation range of the stereoscopic film source.
PCT/CN2016/097696 2015-12-28 2016-08-31 Procédé et appareil permettant d'obtenir des paramètres de parallaxe d'une source de film stéréoscopique WO2017113850A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201511000911.5 2015-12-28
CN201511000911.5A CN105872516A (zh) 2015-12-28 2015-12-28 获取立体片源视差参数的方法及装置

Publications (1)

Publication Number Publication Date
WO2017113850A1 true WO2017113850A1 (fr) 2017-07-06

Family

ID=56624389

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/097696 WO2017113850A1 (fr) 2015-12-28 2016-08-31 Procédé et appareil permettant d'obtenir des paramètres de parallaxe d'une source de film stéréoscopique

Country Status (2)

Country Link
CN (1) CN105872516A (fr)
WO (1) WO2017113850A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111026641A (zh) * 2019-11-14 2020-04-17 北京云聚智慧科技有限公司 一种图片比较方法和电子设备

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105872516A (zh) * 2015-12-28 2016-08-17 乐视致新电子科技(天津)有限公司 获取立体片源视差参数的方法及装置
CN110007475A (zh) * 2019-04-17 2019-07-12 万维云视(上海)数码科技有限公司 利用虚拟深度补偿视力的方法与装置
CN112255536B (zh) * 2020-09-21 2023-05-26 山东产研鲲云人工智能研究院有限公司 开关故障的检测方法、装置、电子设备及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102254348A (zh) * 2011-07-25 2011-11-23 北京航空航天大学 一种基于块匹配视差估计的中间视图合成方法
US20120014590A1 (en) * 2010-06-25 2012-01-19 Qualcomm Incorporated Multi-resolution, multi-window disparity estimation in 3d video processing
US20120062548A1 (en) * 2010-09-14 2012-03-15 Sharp Laboratories Of America, Inc. Reducing viewing discomfort
CN102714751A (zh) * 2010-01-13 2012-10-03 夏普株式会社 在显示器上显示立体图像对以降低观看不适的方法和显示设备
CN104065954A (zh) * 2014-07-03 2014-09-24 中国传媒大学 一种高清立体视频的视差范围快速检测方法
US20150138323A1 (en) * 2011-08-03 2015-05-21 Sony Corporation Image processing device and method, and program
CN105872516A (zh) * 2015-12-28 2016-08-17 乐视致新电子科技(天津)有限公司 获取立体片源视差参数的方法及装置

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102006493A (zh) * 2010-11-26 2011-04-06 北京新岸线网络技术有限公司 一种3d视频图像的视差调节方法及装置
JP5769248B2 (ja) * 2011-09-20 2015-08-26 Necソリューションイノベータ株式会社 ステレオマッチング処理装置、ステレオマッチング処理方法、及び、プログラム
CN102999939B (zh) * 2012-09-21 2016-02-17 魏益群 坐标获取装置、实时三维重建系统和方法、立体交互设备
CN104320647B (zh) * 2014-10-13 2017-10-10 深圳超多维光电子有限公司 立体图像生成方法及显示装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111026641A (zh) * 2019-11-14 2020-04-17 北京云聚智慧科技有限公司 一种图片比较方法和电子设备
CN111026641B (zh) * 2019-11-14 2023-06-20 北京云聚智慧科技有限公司 一种图片比较方法和电子设备

Also Published As

Publication number Publication date
CN105872516A (zh) 2016-08-17

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16880643

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16880643

Country of ref document: EP

Kind code of ref document: A1