WO2016202073A1 - Method and apparatus for image processing - Google Patents

Method and apparatus for image processing

Info

Publication number
WO2016202073A1
Authority
WO
WIPO (PCT)
Prior art keywords
sub-area
scene
value
exposure
Prior art date
Application number
PCT/CN2016/079277
Other languages
English (en)
Chinese (zh)
Inventor
李振华
李礼
Original Assignee
乐视控股(北京)有限公司
乐视移动智能信息技术(北京)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 乐视控股(北京)有限公司 and 乐视移动智能信息技术(北京)有限公司
Publication of WO2016202073A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene

Definitions

  • the embodiments of the present application relate to the field of image technologies, and in particular, to a method and an apparatus for image processing.
  • the camera function has become a basic configuration of a terminal, such as a mobile phone, a tablet computer or a smart wearable device, and is no longer limited to use in a camera or a video camera. More and more users choose to use the terminal equipment to take pictures, which brings great fun and convenience to people's life and work.
  • an LED flash is usually set on the terminal device, and the LED flash is used to emit light and illuminate the object, thereby taking a photo with sufficient brightness.
  • However, when the scene hierarchy of the scene is relatively complicated, even if the LED flash is used, the scene behind the object to be photographed is still dark and unclear, and the scene cannot be presented at its best.
  • The embodiments of the present application provide a method and a device for image processing, so as to solve the problem that image quality is poor when shooting at night and the scene cannot be presented at its best.
  • An embodiment of the present application provides a method for image processing, including:
  • determining, according to brightness statistics of a plurality of sub-areas in a framing area, a scene scene level of each sub-area, wherein the brightness statistics of the plurality of sub-areas include: a correspondence between the brightness value, the number of exposure lines, and the sensitivity value corresponding to each sub-area;
  • An embodiment of the present application provides an apparatus for image processing, including:
  • the level determining module is configured to determine, according to the brightness statistics of the plurality of sub-areas in the framing area, the scene scene level of each of the sub-areas, wherein the brightness statistics of the plurality of sub-areas include: a correspondence between the brightness value, the number of exposure lines, and the sensitivity value corresponding to each sub-area;
  • the exposure determining module is configured to determine an exposure number and a current value corresponding to each exposure according to the determined scene scene level of each sub-area;
  • the pre-processing image acquisition module is configured to perform multiple exposures according to the determined number of exposures and the current value corresponding to each exposure, to obtain a plurality of pre-processed images that match the scene scene level;
  • a synthesizing module configured to synthesize the plurality of pre-processed pictures according to a scene scene level of the plurality of sub-areas to obtain a target picture.
  • the embodiment of the present application also provides a computer readable recording medium on which a program for executing the above method is recorded.
  • The embodiments of the present invention provide a method and an apparatus for image processing, which can separately determine the scene scene level of each of a plurality of sub-areas in a framing area according to brightness statistical information, then determine the number of exposures and the current value corresponding to each exposure according to the determined scene scene levels of the plurality of sub-areas, perform multiple exposures according to the determined number of exposures and the current value corresponding to each exposure to obtain a plurality of pre-processed pictures matching the scene scene levels, and finally synthesize the plurality of pre-processed pictures according to the scene scene levels of the plurality of sub-areas to obtain a target picture.
  • In this way, multiple exposure shots can be performed according to the scene scene level of each sub-area in the entire framing area, so that each sub-area in the entire framing area obtains the required exposure value and reaches an appropriate brightness; finally, the picture with the best brightness value corresponding to each sub-area is extracted from the pre-processed pictures obtained by the exposure shots and the target picture is obtained, ensuring that the image of each part of the captured target picture is clear, which improves the clarity and resolution of the entire target picture and makes it more vivid and more enjoyable to view.
  • Embodiment 1 is a flow chart showing the steps of a method for image processing in Embodiment 1 of the present application;
  • FIG. 2 is a flow chart showing the steps of a method for image processing in Embodiment 2 of the present application;
  • FIG. 3 is a flow chart showing the steps of a shooting process in Embodiment 3 of the present application.
  • FIG. 4 is a schematic diagram of a framing area division in Embodiment 3 of the present application.
  • FIG. 5 is a structural block diagram of an apparatus for image processing in Embodiment 4 of the present application.
  • FIG. 6 is a structural block diagram of an apparatus for image processing according to Embodiment 5 of the present application.
  • the method for image processing includes:
  • Step 102 Determine a scene scene level of each sub-area according to brightness statistical information of the plurality of sub-areas in the framing area.
  • The framing area may be determined according to the boundary of the scene in the field of view of the lens (camera), and the framing area is then divided into a plurality of sub-areas according to a certain standard. It should be apparent to those skilled in the art that the framing area can be divided in any suitable manner. For example, according to an equal-division principle, the framing area may be equally divided into 3x3 sub-areas, 4x4 sub-areas, 9x9 sub-areas, and the like. For another example, the framing area may be divided into a plurality of sub-areas by using a preset plurality of horizontal and vertical coordinate values (Xi, Yj) as origins, wherein i ≥ 1 and j ≥ 1.
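  • As an illustrative sketch only (not part of the original disclosure), the grid division described above can be expressed as follows in Python; the 4x4 grid, the simulated frame size, and the function name divide_into_subareas are assumptions chosen for the example:

        import numpy as np

        def divide_into_subareas(luma, rows=4, cols=4):
            """Split a 2-D luma (brightness) array into rows x cols sub-areas.

            Returns a dict mapping a 1-based, row-major sub-area index to the
            corresponding block of the luma array (trailing pixels that do not
            fill a whole block are ignored).
            """
            h, w = luma.shape
            sub_h, sub_w = h // rows, w // cols
            subareas = {}
            for r in range(rows):
                for c in range(cols):
                    idx = r * cols + c + 1            # sub-areas 1 .. rows*cols
                    subareas[idx] = luma[r * sub_h:(r + 1) * sub_h,
                                         c * sub_w:(c + 1) * sub_w]
            return subareas

        # Example: mean brightness value per sub-area for a simulated frame.
        frame = np.random.randint(0, 256, size=(480, 640))
        mean_luma = {idx: block.mean() for idx, block in divide_into_subareas(frame).items()}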
  • the brightness statistics of the plurality of sub-areas include, but are not limited to, a correspondence between a brightness value (luma), an exposure line number (linecnt), and a sensitivity value (gain) corresponding to each sub-area. .
  • The brightness information of a photo can be represented by a brightness value: for example, the greater the brightness value, the brighter the photo, and the smaller the brightness value, the darker the photo.
  • the brightness value can be adjusted by the number of exposure lines and/or the sensitivity.
  • Generally, the brightness value is proportional to the number of exposure lines and the sensitivity value, that is, as the number of exposure lines and/or the sensitivity value increases, the brightness value increases.
  • the change curve (trend, amplitude) between the brightness value and the exposure line number and/or the sensitivity value may indirectly reflect the scene scene level, for example:
  • If, in the current sub-area, the brightness value responds immediately to a change in the sensitivity value (for example, when the sensitivity value is doubled, the brightness value also doubles), that is, the brightness value varies strongly with the sensitivity value and the slope of the brightness-sensitivity curve is large, it can be determined that the current sub-area is a place that can be illuminated by the flash, that is, a close-up.
  • If the brightness value of the current sub-area does not change substantially when the sensitivity value changes (the slope of the brightness-sensitivity curve is essentially 0), it can be determined that the current sub-area cannot be illuminated by the flash, that is, a distant view.
  • If the brightness value of the current sub-area changes slowly with the change of the sensitivity value (for example, when the sensitivity value is doubled, the brightness value increases by only 0.2 times), that is, the brightness value changes smoothly and only slightly with the sensitivity value and the slope of the brightness-sensitivity curve is small, it can be determined that the current sub-area is a medium scene.
  • The curve (trend, amplitude) between the brightness value and the number of exposure lines can likewise indirectly reflect the scene scene level: if the brightness value of the current sub-area changes strongly with the number of exposure lines, the current sub-area is a close-up; if the brightness value does not change (or changes very little) with the number of exposure lines, the current sub-area is a distant view; and if the brightness value changes slowly with the number of exposure lines, the current sub-area is a medium scene.
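  • The slope test described above can be sketched as follows (an illustrative Python example, not part of the original disclosure; the threshold values close_slope and medium_slope are assumed for the example and would be tuned in practice):

        import numpy as np

        def classify_subarea(gain_samples, luma_samples, close_slope=0.8, medium_slope=0.1):
            """Classify a sub-area as close-up / medium / distant from the slope of
            its brightness-vs-sensitivity curve recorded during the pre-flash sweep."""
            # Least-squares slope of the brightness value as a function of the gain.
            slope, _intercept = np.polyfit(gain_samples, luma_samples, 1)
            # Normalise by the starting operating point so the test is scale-free.
            relative_slope = slope * gain_samples[0] / max(luma_samples[0], 1e-6)
            if relative_slope >= close_slope:
                return "close-up"      # brightness follows the gain: flash reaches it
            if relative_slope >= medium_slope:
                return "medium"        # brightness follows the gain only weakly
            return "distant"           # brightness barely reacts: flash does not reach it

        # Example: brightness roughly doubles when the gain doubles -> close-up.
        gains = np.array([50.0, 55.0, 60.0, 65.0])
        lumas = np.array([100.0, 110.0, 121.0, 130.0])
        print(classify_subarea(gains, lumas))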
  • the scene scene level of each sub-area can be separately determined according to the brightness statistical information of the plurality of sub-areas in the framing area.
  • the scene scene level is not limited to the above-mentioned scenes of the far, middle, and near levels.
  • the appropriate scene scene level can be determined according to the preset scene scene level determination rule and the brightness statistical information of each sub-area.
  • Step 104 Determine the number of exposures and the current value corresponding to each exposure according to the determined scene scene level of each sub-area.
  • A different current value may be selected for each exposure. For example, if it is determined that the plurality of sub-areas in the framing area correspond to three scene scene levels (distant, medium, and close-up), at least three exposures may be performed: the current value selected for the first exposure may be a first current value adapted to the distant level; the current value selected for the second exposure may be a second current value adapted to the medium level; and the current value selected for the third exposure may be a third current value adapted to the close-up level.
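  • As a rough sketch of how such an exposure plan could be derived (illustrative only; the scene-level names and current factors are placeholders, not values from the original disclosure), the number of exposures simply follows from the set of distinct scene levels present:

        # Hypothetical mapping from scene level to flash current, expressed as a
        # multiple of a standard current I; the concrete factors are illustrative.
        CURRENT_PER_LEVEL = {
            "close-up": 0.9,   # nearby objects need less flash current
            "medium":   1.0,
            "distant":  1.5,   # far objects need a stronger flash
        }

        def plan_exposures(subarea_levels):
            """Return one (scene level, current factor) pair per distinct level present.

            subarea_levels maps a sub-area index to its scene level string; the
            number of exposures equals the number of distinct levels."""
            levels_present = sorted(set(subarea_levels.values()),
                                    key=lambda level: CURRENT_PER_LEVEL[level])
            return [(level, CURRENT_PER_LEVEL[level]) for level in levels_present]

        # Example: three levels present -> three exposures with three currents.
        levels = {1: "close-up", 2: "close-up", 3: "medium", 4: "distant"}
        print(plan_exposures(levels))   # [('close-up', 0.9), ('medium', 1.0), ('distant', 1.5)]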
  • Step 106 Perform multiple exposures according to the determined number of exposures and the current value corresponding to each exposure to obtain a plurality of pre-processed pictures that match the scene scene level.
  • Step 108 Synthesize the plurality of pre-processed pictures according to a scene scene level of the plurality of sub-areas to obtain a target picture.
  • With the method of image processing provided by this embodiment, the scene scene level of each sub-area can be determined separately according to the brightness statistical information of the plurality of sub-areas in the framing area; the number of exposures and the current value corresponding to each exposure are then determined according to the determined scene scene level of each sub-area; multiple exposures are performed according to the determined number of exposures and the current value corresponding to each exposure to obtain a plurality of pre-processed pictures matching the scene scene levels; and finally the plurality of pre-processed pictures are synthesized according to the scene scene levels of the plurality of sub-areas to obtain a target picture.
  • In this way, multiple exposure shots can be performed according to the scene scene level of each sub-area in the entire framing area, so that each sub-area in the entire framing area obtains the required exposure value and reaches an appropriate brightness; finally, the picture with the best brightness value corresponding to each sub-area is extracted from the pre-processed pictures obtained by the exposure shots and the target picture is obtained, ensuring that the image of each part of the captured target picture is clear, which improves the clarity and resolution of the entire target picture and makes it more vivid and more enjoyable to view.
  • The method of image processing may be implemented by, but is not limited to, a mobile terminal, for example by a module/component provided in the mobile terminal or by an application installed in the mobile terminal; alternatively, it may be implemented by a photographing device such as a camera or a video camera.
  • the method for image processing may include:
  • Step 202 The mobile terminal separately determines the scene scene level of each sub-area according to the brightness statistics of the plurality of sub-areas in the framing area.
  • the brightness statistics of the plurality of sub-areas include, but are not limited to, a correspondence between a brightness value, an exposure line number, and a sensitivity value corresponding to each sub-area.
  • the step 202 may include the following sub-steps:
  • Sub-step 2022 the mobile terminal determines the ratio of the brightness value and the exposure line number and/or the sensitivity value under each sub-area according to the correspondence between the brightness value, the exposure line number and the sensitivity value under each sub-area. .
  • k1 and/or k2 corresponding to each sub-area are not completely the same.
  • For example, k1 and k2 corresponding to sub-area 1 may both be 10, while k1 corresponding to sub-area 2 may be 0.5 and k2 may be 0.3.
  • k1 can be used to represent the ratio result between the luminance value and the number of exposure lines;
  • k2 can be used to represent the ratio result between the luminance value and the sensitivity value.
  • the result of the ratio between the brightness value and the number of exposure lines, and/or the ratio between the brightness value and the sensitivity value may be used to determine a scene scene level of the sub-area, specifically referring to the following sub-steps:
  • Sub-step 2024 the mobile terminal compares the ratio result with a preset scene scene hierarchy criterion to determine a scene scene level of each sub-area.
  • For example, when the ratio (k2) between the brightness value and the sensitivity value is 10, it can be determined that the scene scene level of the current sub-area is a close-up; when the ratio (k2) between the brightness value and the sensitivity value is 1, it can be determined that the scene scene level of the current sub-area is a medium scene; and when the ratio (k2) between the brightness value and the sensitivity value is 0.1, it can be determined that the scene scene level of the current sub-area is a distant view.
  • the correspondence between the above ratio and the scene scene level is merely exemplary and should not be construed as limiting the application.
  • the scene scene level can also be determined according to the ratio between the brightness value and the number of exposure lines, and details are not described herein again.
  • Step 204 The mobile terminal determines the number of exposures and the current value corresponding to each exposure according to the determined scene scene level of each sub-area.
  • the framing area is divided into a plurality of sub-areas, for example, the framing area is divided into ten sub-areas; further, according to the brightness statistics of the ten sub-areas, the scene scene level of the ten sub-areas can be determined as follows : The scene scene level of the sub-area numbered 01-03 is the close-up scene, the scene scene level of the sub-area numbered 04-08 is the medium scene, and the scene scene level of the sub-area numbered 09-10 is the distant view.
  • the exposure current selected by the first exposure may be a first current matched to the close-up
  • the exposure current selected by the second exposure may be a second current matching the medium scene
  • The exposure current selected for the third exposure may be a third current matching the distant view.
  • Step 206 The mobile terminal performs multiple exposures according to the determined number of exposures and the current value corresponding to each exposure, to obtain a plurality of preprocessed pictures that match the scene scene level.
  • The first current, the second current, and the third current may be used to perform three exposures, respectively, so as to obtain in turn a pre-processed picture matching the close-up (shot with the first current), a pre-processed picture matching the medium scene (shot with the second current), and a pre-processed picture matching the distant view (shot with the third current).
  • Step 208 The mobile terminal synthesizes the plurality of pre-processed pictures according to the scene scene level of the plurality of sub-areas to obtain a target picture.
  • the foregoing step 208 may include the following sub-steps:
  • Sub-step 2082 the mobile terminal selects the matched pre-processed picture according to the scene level corresponding to each sub-area, and extracts the sub-picture corresponding to each sub-area from the matched pre-processed picture.
  • the 01 sub-area is taken as an example for description.
  • One feasible way may be as follows: first, select the pre-processed picture corresponding to the 01 sub-area (a close-up), i.e., pre-processed picture 1, which matches the close-up and was shot with the first current; then, determine the position of the 01 sub-area in pre-processed picture 1 (the 01' sub-region in pre-processed picture 1); finally, crop the picture of the 01' sub-region from pre-processed picture 1 as the sub-picture corresponding to the 01 sub-area.
  • the sub-pictures corresponding to the respective sub-areas can be obtained sequentially (or asynchronously) in the above manner.
  • That is, the sub-pictures for the 01-03 sub-areas need to be extracted from the corresponding pre-processed picture 1, the sub-pictures for the 04-08 sub-areas from the corresponding pre-processed picture 2, and the sub-pictures for the 09-10 sub-areas from the corresponding pre-processed picture 3.
  • Sub-step 2084 the mobile terminal synthesizes the sub-pictures corresponding to each of the extracted sub-regions to obtain the target picture.
  • For example, the sub-pictures 01-03 corresponding to the 01-03 sub-areas may be extracted in sequence from pre-processed picture 1, the sub-pictures 04-08 corresponding to the 04-08 sub-areas from pre-processed picture 2, and the sub-pictures 09-10 corresponding to the 09-10 sub-areas from pre-processed picture 3; then, the sub-pictures 01-10 are combined in the order of the ten sub-areas described above to obtain the target picture.
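  • The synthesis of sub-steps 2082 and 2084 can be sketched as follows (illustrative Python only, reusing the same grid convention as the earlier sketch; the dictionary layout of the inputs is an assumption):

        import numpy as np

        def synthesize_target(preprocessed, subarea_levels, rows, cols):
            """Compose the target picture from the pre-processed pictures.

            preprocessed:   dict mapping scene level -> full-size picture (2-D array),
                            one picture per exposure.
            subarea_levels: dict mapping 1-based, row-major sub-area index -> scene level.
            Each sub-area of the target is copied from the pre-processed picture whose
            exposure matches that sub-area's scene level."""
            any_picture = next(iter(preprocessed.values()))
            h, w = any_picture.shape
            sub_h, sub_w = h // rows, w // cols
            target = np.zeros_like(any_picture)
            for idx, level in subarea_levels.items():
                r, c = divmod(idx - 1, cols)
                ys = slice(r * sub_h, (r + 1) * sub_h)
                xs = slice(c * sub_w, (c + 1) * sub_w)
                target[ys, xs] = preprocessed[level][ys, xs]
            return target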
  • the brightness statistics of the plurality of sub-areas in the framing area may be determined by the following steps:
  • First, the mobile terminal adjusts the sensitivity value and/or the number of exposure lines at least once according to a first setting rule.
  • One skilled in the art can use any suitable rule to adjust the sensitivity value and/or the number of exposure lines.
  • the sensitivity value and/or the number of exposure lines can be adjusted by one of the following possible ways:
  • The mobile terminal sequentially increases the sensitivity value according to a first set step size; and, when the sensitivity value is equal to the maximum threshold, sequentially increases the number of exposure lines according to a second set step size.
  • Alternatively, the mobile terminal sequentially decreases the sensitivity value according to a third set step size; and, when the sensitivity value is equal to the minimum threshold, sequentially decreases the number of exposure lines according to a fourth set step size.
  • The first set step size, the second set step size, the third set step size, and the fourth set step size may each take any appropriate value according to actual conditions; they may all be the same, or may be selected independently according to actual conditions.
  • For example, 0.1 times a reference value (a preset sensitivity value, or the sensitivity value corresponding to the pre-flash) may be used as the first set step size for adjusting the sensitivity value.
  • the manner of adjusting with the reference value may be as follows:
  • the mobile terminal adjusts the sensitivity value and/or the number of exposure lines at least once by using the first sensitivity value obtained in advance and/or the determined first exposure line number as the adjustment initial value.
  • the first sensitivity value may be an adjustment initial value (reference value) corresponding to gain
  • the first exposure line number may be an adjustment initial value (reference value) corresponding to the linecnt.
  • the adjustment of the sensitivity value and/or the number of exposure lines is performed by using the first sensitivity value and/or the first exposure line number as an adjustment initial value.
  • A feasible manner of obtaining the first sensitivity value and/or the first number of exposure lines may be as follows: during the pre-flash of the flash, the mobile terminal records the sensitivity value and/or the number of exposure lines corresponding to the moment when the brightness value during the pre-flash satisfies the preset standard brightness value, and determines the recorded sensitivity value and/or number of exposure lines as the first sensitivity value and/or the first number of exposure lines.
  • the mobile terminal separately records the adjustment results of each sub-area after each adjustment, and obtains multiple statistical results.
  • each of the statistical results of each sub-area includes: a sensitivity value, an exposure line number, and a brightness value corresponding to each sub-area after each adjustment.
  • the mobile terminal determines the brightness statistics of each sub-area according to the average of the plurality of statistical results corresponding to each sub-area.
  • each sub-area corresponds to one statistical result for each adjustment. For example, if three adjustments are made, the 01-10 sub-regions each have three statistical results.
  • the average of the plurality of statistical results corresponding to each sub-region may be used as the luminance statistical information corresponding to each sub-region.
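  • The adjust-record-average loop described above can be sketched as follows (illustrative Python; capture_mean_luma is a hypothetical callback standing in for an actual capture at the given gain/linecnt, and the step factors mirror the 0.1-step example given later):

        def gather_brightness_statistics(capture_mean_luma, base_gain, base_linecnt,
                                         factors=(0.8, 0.9, 1.0, 1.1, 1.2)):
            """Adjust the sensitivity in steps around an initial value and record, for
            every sub-area, the ratio between brightness and sensitivity after each
            adjustment; the averaged ratio is that sub-area's brightness statistic.

            capture_mean_luma(gain, linecnt) is assumed to return a dict mapping a
            sub-area index to its mean brightness value at that setting."""
            ratios = {}                        # sub-area index -> list of luma/gain ratios
            for factor in factors:
                gain = factor * base_gain      # e.g. 0.8x50 -> 0.9x50 -> 50 -> 1.1x50 -> ...
                lumas = capture_mean_luma(gain, base_linecnt)
                for idx, luma in lumas.items():
                    ratios.setdefault(idx, []).append(luma / gain)
            # Average the recorded ratio results per sub-area.
            return {idx: sum(vals) / len(vals) for idx, vals in ratios.items()}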
  • the method may further include the following step 210.
  • the step 210 may be performed, but not limited to, before the step 202 above.
  • Step 210 The mobile terminal divides the framing area determined by the lens selection into a plurality of sub-areas according to a preset division criterion.
  • With the method of image processing provided by this embodiment, the scene scene level of each sub-area may be determined separately according to the brightness statistical information of the plurality of sub-areas in the framing area; then, according to the determined scene scene level of each sub-area, the number of exposures and the current value corresponding to each exposure are determined, multiple exposures are performed according to the determined number of exposures and the current value corresponding to each exposure to obtain a plurality of pre-processed pictures matching the scene scene levels, and finally the plurality of pre-processed pictures are synthesized according to the scene scene levels of the plurality of sub-areas to obtain a target picture.
  • In this way, multiple exposure shots can be performed according to the scene scene level of each sub-area in the entire framing area, so that each sub-area in the entire framing area obtains the required exposure value and reaches an appropriate brightness; finally, the picture with the best brightness value corresponding to each sub-area is extracted from the pre-processed pictures obtained by the exposure shots and the target picture is obtained, ensuring that the image of each part of the captured target picture is clear, which improves the clarity and resolution of the entire target picture and makes it more vivid and more enjoyable to view.
  • the method for image processing is described in detail by taking a specific processing flow of one shooting process as an example.
  • FIG. 3 a flow chart of steps in a shooting process in Embodiment 3 of the present application is shown.
  • the steps of the shooting process may be as follows:
  • step 302 the framing area in the lens is divided into 4x4 sub-areas.
  • the viewfinder area may be divided into 4x4 sub-areas: sub-areas 001-016.
  • step 304 the LED flash is called to perform pre-flash, and the exposure measurement is performed to determine the sensitivity value and the number of exposure lines when the pre-flash value satisfies the preset standard brightness value during the pre-flash process.
  • The preset standard brightness value (for example, luma1) is set in advance, and the current brightness value can be made to reach it by modifying the sensitivity value (gain) and the number of exposure lines (linecnt).
  • the sensitivity value and the number of exposure lines can be modified by one of the following possible ways:
  • For example, the brightness value can be adjusted based on the sensitivity values and exposure line numbers recorded in a preset table index.
  • An example table index in an implementation of the present application is shown in Table 1.
  • The gain and linecnt combinations recorded in Table 1 can be used preferentially to stabilize the brightness, that is, to make the current brightness value satisfy the preset standard brightness value.
  • For example, if the standard brightness value is 100, the corresponding sensitivity value and number of exposure lines are selected accordingly, for example a sensitivity value of 50 and 1000 exposure lines, that is, the correspondence relationship {100, 50, 1000}.
  • the numerical values in Table 1 and the size (100) of the standard luminance values and the determined correspondence ⁇ 100, 50, 1000 ⁇ are merely illustrative.
  • the sensitivity values and the number of exposure lines can be modified by using the values given in the table index table to improve the efficiency and quickly adjust the brightness values to the preset standard brightness values.
  • the modification of the sensitivity value and the number of exposure lines is not limited to the value given in the table index table, and any appropriate value may be selected according to the actual situation to adjust the sensitivity value and the number of exposure lines.
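  • A minimal sketch of such a table index lookup is given below (illustrative only; the table holds just the single {100, 50, 1000} row mentioned above, and the fallback to the nearest recorded brightness is an added assumption):

        # Table index: standard brightness value -> (sensitivity value, number of exposure lines).
        TABLE_INDEX = {
            100: (50, 1000),
        }

        def settings_for_standard_luma(standard_luma, table=TABLE_INDEX):
            """Return the (gain, linecnt) pair recorded for a standard brightness value,
            falling back to the nearest recorded brightness if there is no exact entry."""
            if standard_luma in table:
                return table[standard_luma]
            nearest = min(table, key=lambda luma: abs(luma - standard_luma))
            return table[nearest]

        print(settings_for_standard_luma(100))   # (50, 1000)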
  • Step 306 Determine a scene scene level of each sub-area according to brightness statistical information of the plurality of sub-areas in the framing area.
  • the correspondence between the luminance value and the sensitivity value and the number of exposure lines is: ⁇ 100, 50, 1000 ⁇ .
  • the sensitivity value and the number of exposure lines can be adjusted with the sensitivity value 50 and the exposure line number 1000 recorded in the correspondence relationship ⁇ 100, 50, 1000 ⁇ as initial values.
  • the sensitivity value can be adjusted by using 0.1 times the initial value (50) as the adjustment step size.
  • For example, the adjustment can be made according to the following increment/decrement rule: ... 0.8x50 → 0.9x50 → 50 → 1.1x50 → 1.2x50 → ..., and the correspondence between the sensitivity value and the brightness value is recorded during each adjustment.
  • Similarly, the number of exposure lines can be further adjusted, and the correspondence between the brightness value, the sensitivity value, and the number of exposure lines during each adjustment is recorded.
  • the sensitivity value and the number of exposure lines can also be adjusted at the same time, which is not limited in this embodiment.
  • Taking sub-area 001 as an example, the flow of determining the brightness statistical information of sub-area 001 and of determining the scene scene level of sub-area 001 according to the determined brightness statistical information is described: the sensitivity value and/or the number of exposure lines are adjusted multiple times, and the brightness changes in sub-area 001 after each adjustment are recorded separately.
  • the brightness change after each adjustment can be as follows:
  • In this example the number of exposure lines (linecnt) is unchanged, that is, the brightness value is changed by adjusting only the sensitivity value; therefore, the change in the number of exposure lines (linecnt) can be temporarily ignored here.
  • Suppose the ratios between the brightness value and the sensitivity value after the adjustments are, in order, 10, 9, 11, and 10; the average (10) of the four ratio results can then be taken as the brightness statistical information of sub-area 001.
  • The determined brightness statistical information of sub-area 001 is compared with a preset scene scene division standard, and the scene scene level of sub-area 001 is determined to be, for example, the medium scene (this is only an example; the scene scene level of sub-area 001 is determined according to the actual division criterion).
  • the process of determining the scene scene level of other sub-areas can refer to the determination process of the scene scene level of the above sub-area 001, and will not be described one by one.
  • the final scene level of the scene can be as follows: close-up: sub-area 005-008; medium-level: sub-area 001-004; distant view: sub-area 009-012; super-distant: sub-area 013-016.
  • Step 308 Determine the number of exposures and the current value corresponding to each exposure according to the determined scene scene level of each sub-area.
  • In this example, a total of four scene scene levels are determined; therefore, at least four exposures can be performed: the first exposure may select a current value of 0.9I that matches the close-up (I is the standard current value), the second exposure may select a current value of 1.0I that matches the medium scene, the third exposure may select a current value of 1.5I that matches the distant view, and the fourth exposure may select a current value of 2.0I that matches the super-distant view.
  • the value of the above current is only an exemplary description, and specifically, the current value may be selected according to actual conditions.
  • Step 310 Perform multiple exposures according to the determined number of exposures and the current value corresponding to each exposure to obtain a plurality of preprocessed pictures that match the scene scene level.
  • pre-processed picture 1: close-up picture
  • pre-processed picture 2: medium scene picture
  • pre-processed picture 3: distant view picture
  • pre-processed picture 4: super-distant view picture
  • Step 312 Synthesize the plurality of preprocessed pictures according to a scene scene level of the plurality of sub-areas to obtain a target picture.
  • For example, the sub-pictures 005 to 008 corresponding to the positions of sub-areas 005 to 008 (close-up areas) can be extracted from pre-processed picture 1 (close-up picture); the sub-pictures 001 to 004 corresponding to the positions of sub-areas 001 to 004 (medium scene areas) are extracted from pre-processed picture 2 (medium scene picture); the sub-pictures 009 to 012 corresponding to the positions of sub-areas 009 to 012 (distant view areas) are extracted from pre-processed picture 3 (distant view picture); and the sub-pictures 013 to 016 corresponding to the positions of sub-areas 013 to 016 (super-distant view areas) are extracted from pre-processed picture 4 (super-distant view picture).
  • the extracted sub-pictures 001 to 016 are combined to obtain a composite picture, and the synthesized picture is used as a final target picture.
  • With the image processing method of this embodiment, the scene scene level of each of the plurality of sub-areas in the framing area may be determined separately according to the brightness statistical information; the number of exposures and the current value corresponding to each exposure are then determined according to the determined scene scene levels of the plurality of sub-areas; multiple exposures are performed according to the determined number of exposures and the current value corresponding to each exposure to obtain a plurality of pre-processed pictures matching the scene scene levels; and finally the plurality of pre-processed pictures are synthesized according to the scene scene level of each sub-area to obtain a target picture.
  • In this way, multiple exposure shots may be performed according to the scene scene level of each sub-area in the entire framing area, so that each sub-area in the entire framing area obtains the required exposure value and reaches the appropriate brightness; finally, the picture with the optimal brightness value corresponding to each sub-area is extracted from the pre-processed pictures obtained by each exposure shot and synthesized to obtain the target picture, which ensures that the images of the various parts of the captured target picture are clear, improving the clarity and resolution of the entire target picture and making it more vivid and more enjoyable to view.
  • the apparatus for image processing includes:
  • The level determining module 502 is configured to determine the scene scene level of each sub-area according to the brightness statistical information of the plurality of sub-areas in the framing area.
  • the brightness statistical information of the plurality of sub-areas includes: a correspondence relationship between a brightness value, an exposure line number, and a sensitivity value corresponding to each sub-area.
  • the exposure determination module 504 is configured to determine the number of exposures and the current value corresponding to each exposure according to the determined scene scene level of each sub-area.
  • the pre-processing picture obtaining module 506 is configured to perform multiple exposures according to the determined number of exposures and the current value corresponding to each exposure to obtain a plurality of pre-processed pictures that match the scene scene level.
  • the synthesizing module 508 is configured to synthesize the plurality of pre-processed pictures according to a scene scene level of the plurality of sub-areas to obtain a target picture.
  • With the apparatus for image processing provided by this embodiment, the scene scene level of each of the plurality of sub-areas in the framing area may be determined separately according to the brightness statistical information; the number of exposures and the current value corresponding to each exposure are then determined according to the determined scene scene levels of the plurality of sub-areas; multiple exposures are performed according to the determined number of exposures and the current value corresponding to each exposure to obtain a plurality of pre-processed pictures matching the scene scene levels; and finally the plurality of pre-processed pictures are synthesized according to the scene scene levels of the plurality of sub-areas to obtain a target picture.
  • In this way, multiple exposure shots can be performed according to the scene scene level of each sub-area in the entire framing area, so that each sub-area in the entire framing area obtains the required exposure value and reaches an appropriate brightness; finally, the picture with the best brightness value corresponding to each sub-area is extracted from the pre-processed pictures obtained by the exposure shots and the target picture is obtained, ensuring that the image of each part of the captured target picture is clear, which improves the clarity and resolution of the entire target picture and makes it more vivid and more enjoyable to view.
  • the apparatus for image processing may be, but is not limited to, being disposed in a mobile terminal.
  • the apparatus for image processing may also be disposed in a device for taking a picture, such as a camera, a video camera, or the like.
  • the device for image processing may include:
  • the level determining module 602 is configured to determine a scene scene level of each sub-area according to brightness statistics of the plurality of sub-areas in the framing area.
  • the brightness statistical information of the plurality of sub-areas includes: a correspondence relationship between the brightness value, the number of exposure lines, and the sensitivity value corresponding to each sub-area.
  • The brightness statistics of the plurality of sub-areas within the framing area may be determined by the following modules: an adjusting module, configured to adjust the sensitivity value and/or the number of exposure lines at least once according to a first setting rule;
  • the statistical result obtaining module is configured to perform statistical recording on the adjustment result of each sub-area for each time, and obtain a plurality of statistical results, wherein each statistical result of each sub-area includes: corresponding to each sub-area after each adjustment is completed Sensitivity value, number of exposure lines, and brightness value.
  • the statistical information determining module is configured to determine brightness statistical information of each sub-area according to an average value of the plurality of statistical results corresponding to each sub-area.
  • the adjusting module may further include: a first adjusting module configured to sequentially increase the sensitivity value according to the first setting step when the sensitivity value is less than or equal to the maximum threshold ; and, when the sensitivity value is equal to the maximum threshold, the number of exposure lines is sequentially increased according to the second set step.
  • The second adjusting module is configured to sequentially reduce the sensitivity value according to the third set step size when the sensitivity value is greater than or equal to the minimum threshold; and, when the sensitivity value is equal to the minimum threshold, to sequentially decrease the number of exposure lines according to the fourth set step size.
  • The adjusting module may be configured to use the previously obtained first sensitivity value and/or the determined first number of exposure lines as the adjustment initial value, and to adjust the sensitivity value and/or the number of exposure lines at least once.
  • The first sensitivity value and/or the first number of exposure lines may be obtained by the following modules: a recording module, configured to record, during the pre-flash of the flash, the sensitivity value and/or the number of exposure lines corresponding to the moment when the brightness value during the pre-flash meets the preset standard brightness value;
  • and a determining module, configured to determine the recorded sensitivity value and/or number of exposure lines corresponding to the moment when the brightness value during the pre-flash meets the preset standard brightness value as the first sensitivity value and/or the first number of exposure lines.
  • the level determining module 602 may specifically include:
  • The ratio determining module 6022 is configured to determine the ratio result between the brightness value and the number of exposure lines and/or the sensitivity value in each sub-area according to the correspondence between the brightness value, the number of exposure lines, and the sensitivity value in each sub-area.
  • the comparison module 6024 is configured to compare the ratio result with a preset scene scene hierarchy criterion to determine a scene scene level of each sub-area.
  • the exposure determination module 604 is configured to determine the number of exposures and the current value corresponding to each exposure according to the determined scene scene level of each sub-area.
  • the pre-processing picture obtaining module 606 is configured to perform multiple exposures according to the determined number of exposures and the current value corresponding to each exposure to obtain a plurality of pre-processed pictures that match the scene scene level.
  • the synthesizing module 608 is configured to synthesize the plurality of pre-processed pictures according to a scene scene level of the plurality of sub-areas to obtain a target picture.
  • the synthesizing module may specifically include:
  • the extraction module 6082 is configured to select a matched pre-processed picture according to a scene level corresponding to each sub-area, and extract a sub-picture corresponding to each sub-area from the matched pre-processed picture.
  • the synthesizing module 6084 is configured to synthesize the sub-pictures corresponding to each of the extracted sub-regions to obtain the target picture.
  • The apparatus for image processing may further include: a sub-area dividing module, configured to divide the framing area determined by the lens selection into a plurality of sub-areas according to a preset division criterion.
  • the sub-area partitioning module may be, but is not limited to, being executed before the level determining module 602.
  • With the apparatus for image processing provided by this embodiment, the scene scene level of each of the plurality of sub-areas in the framing area may be determined separately according to the brightness statistical information; the number of exposures and the current value corresponding to each exposure are then determined according to the determined scene scene levels of the plurality of sub-areas; multiple exposures are performed according to the determined number of exposures and the current value corresponding to each exposure to obtain a plurality of pre-processed pictures matching the scene scene levels; and finally the plurality of pre-processed pictures are synthesized according to the scene scene level of each sub-area to obtain a target picture.
  • In this way, multiple exposure shots can be performed according to the scene scene level of each sub-area in the entire framing area, so that each sub-area in the entire framing area obtains the required exposure value and reaches an appropriate brightness; finally, the picture with the best brightness value corresponding to each sub-area is extracted from the pre-processed pictures obtained by the exposure shots and the target picture is obtained, ensuring that the image of each part of the captured target picture is clear, which improves the clarity and resolution of the entire target picture and makes it more vivid and more enjoyable to view.
  • the device embodiments described above are merely illustrative, wherein the units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units, ie may be located A place, or it can be distributed to multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment. Those of ordinary skill in the art can understand and implement without deliberate labor.
  • the present application also provides a computer readable recording medium on which a program for executing the above method is recorded.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

Disclosed are a method and an apparatus for image processing. The method comprises the steps of: determining, according to brightness statistics of a plurality of sub-areas in a framing area, a scene scene level of each sub-area, the brightness statistics of the plurality of sub-areas comprising a correspondence between a brightness value, a number of exposure lines, and a sensitivity value corresponding to each sub-area; determining a number of exposures and a current value corresponding to each exposure according to the scene scene levels of the sub-areas; performing a plurality of exposures according to the determined number of exposures and the determined current value corresponding to each exposure so as to obtain a plurality of pre-processed pictures matching the scene scene levels; and synthesizing the plurality of pre-processed pictures according to the scene scene levels of the plurality of sub-areas so as to obtain a target picture. The embodiments of the present invention improve the clarity and resolution of the entire target picture, making it more vivid and more enjoyable to view.
PCT/CN2016/079277 2015-06-19 2016-04-14 Method and apparatus for image processing WO2016202073A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510345309.9A CN105991939A (zh) 2015-06-19 2015-06-19 Method and apparatus for image processing
CN2015103453099 2015-06-19

Publications (1)

Publication Number Publication Date
WO2016202073A1 true WO2016202073A1 (fr) 2016-12-22

Family

ID=57040246

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/079277 WO2016202073A1 (fr) 2015-06-19 2016-04-14 Method and apparatus for image processing

Country Status (2)

Country Link
CN (1) CN105991939A (fr)
WO (1) WO2016202073A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112312001A (zh) * 2019-07-30 2021-02-02 北京百度网讯科技有限公司 一种图像检测的方法、装置、设备和计算机存储介质

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107872625B (zh) * 2017-12-25 2020-05-22 信利光电股份有限公司 一种双摄像头曝光同步控制方法和系统
CN109525783A (zh) * 2018-12-25 2019-03-26 努比亚技术有限公司 一种曝光拍摄方法、终端及计算机可读存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101909150A (zh) * 2009-06-03 2010-12-08 索尼公司 成像设备和成像控制方法
US20100328482A1 (en) * 2009-06-26 2010-12-30 Samsung Electronics Co., Ltd. Digital photographing apparatus, method of controlling the digital photographing apparatus, and recording medium storing program to implement the method
CN102469248A (zh) * 2010-11-12 2012-05-23 华晶科技股份有限公司 影像拍摄装置及其影像合成方法
CN102917180A (zh) * 2011-08-05 2013-02-06 佳能企业股份有限公司 影像撷取方法及影像撷取装置

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7663689B2 (en) * 2004-01-16 2010-02-16 Sony Computer Entertainment Inc. Method and apparatus for optimizing capture device settings through depth information
CN100480832C (zh) * 2005-08-19 2009-04-22 仁宝电脑工业股份有限公司 利用闪光灯预闪辅助对焦的方法
CN102316261B (zh) * 2010-07-02 2015-05-13 华晶科技股份有限公司 数字相机的感光度的调整方法
CN104104886B (zh) * 2014-07-24 2016-07-06 努比亚技术有限公司 过曝拍摄方法及装置
CN104092954B (zh) * 2014-07-25 2018-09-04 北京智谷睿拓技术服务有限公司 闪光控制方法及控制装置、图像采集方法及采集装置
CN107454343B (zh) * 2014-11-28 2019-08-02 Oppo广东移动通信有限公司 拍照方法、拍照装置及终端

Also Published As

Publication number Publication date
CN105991939A (zh) 2016-10-05

Similar Documents

Publication Publication Date Title
CN106331508B (zh) 拍摄构图的方法及装置
JP6066536B2 (ja) ゴーストのない高ダイナミックレンジ画像の生成
TWI549503B (zh) 電子裝置、自動效果方法以及非暫態電腦可讀取媒體
CN111327824B (zh) 拍摄参数的选择方法、装置、存储介质及电子设备
CN106550184B (zh) 照片处理方法及装置
CN103763477B (zh) 一种双摄像头拍后调焦成像装置和方法
EP3306913B1 (fr) Procédé et appareil de photographie
US9836831B1 (en) Simulating long-exposure images
JP6401324B2 (ja) ダイナミックフォトの撮影方法及び装置
JP2016092618A (ja) 撮像装置およびその制御方法
CN103971547B (zh) 基于移动终端的摄影仿真教学方法及系统
CN105025215A (zh) 一种终端基于多摄像头实现合照的方法及装置
TWI505233B (zh) 影像處理方法及影像處理裝置
JP2014155001A (ja) 画像処理装置及び画像処理方法
WO2017036273A1 (fr) Appareil et procédé d'imagerie
JP2018006912A (ja) 撮像装置、画像処理装置及びそれらの制御方法、プログラム
WO2016202073A1 (fr) Procédé et appareil de traitement d'image
KR20110109574A (ko) 이미지 처리 방법 및 이를 이용한 촬영 장치
WO2018166170A1 (fr) Procédé et dispositif de traitement d'image, et terminal de conférence intelligent
WO2021051304A1 (fr) Réglage de vitesse d'obturateur et procédés d'étalonnage d'obturateur sécurisé, dispositif portable et véhicule aérien sans pilote
US11871123B2 (en) High dynamic range image synthesis method and electronic device
CN106254790A (zh) 拍照处理方法及装置
KR101094648B1 (ko) 구도결정을 하는 사진사 로봇 및 그 제어방법
US20230033956A1 (en) Estimating depth based on iris size
CN106878606B (zh) 一种基于电子设备的图像生成方法和电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16810814

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16810814

Country of ref document: EP

Kind code of ref document: A1