WO2020034702A1 - Control method, apparatus, electronic device and computer-readable storage medium


Publication number
WO2020034702A1
WO2020034702A1 · PCT/CN2019/088244 · CN2019088244W
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
exposure
pixels
pixel unit
imaging
Application number
PCT/CN2019/088244
Other languages
English (en)
Chinese (zh)
Inventor
张弓
Original Assignee
Oppo广东移动通信有限公司
Application filed by Oppo广东移动通信有限公司
Publication of WO2020034702A1

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 — Circuitry for compensating brightness variation in the scene
    • H04N 23/73 — Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N 23/10 — Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N 23/741 — Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Definitions

  • the present disclosure relates to the field of imaging technology, and in particular, to a control method, an apparatus, an electronic device, and a computer-readable storage medium.
  • the imaging device in the prior art uses a pixel unit array of a fixed structure for imaging.
  • The present disclosure provides a control method, an apparatus, an electronic device, and a computer-readable storage medium that automatically adjust the arrangement positions of the short exposure pixels, middle exposure pixels, and/or long exposure pixels in a pixel unit array according to the brightness level of the shooting environment, so that imaging can be performed using the output pixel values of at least two adjacently arranged middle exposure pixels. This retains more effective information in the captured image, improves its brightness, and thus improves the imaging effect, the imaging quality, and the user's shooting experience.
  • It solves the prior-art problem that the hardware structure of the pixels in an imaging device, once determined, cannot be changed and is difficult to adapt to a variety of shooting scenes, so that the imaging quality of captured images is limited by the imaging device in the electronic device and cannot be improved.
  • An embodiment of one aspect of the present disclosure provides a control method applied to an imaging device, where the imaging device includes a pixel unit array composed of multiple exposure pixels, each exposure pixel being a short exposure pixel, a middle exposure pixel, or a long exposure pixel, wherein the exposure duration of the long exposure pixel is greater than that of the middle exposure pixel, and the exposure duration of the middle exposure pixel is greater than that of the short exposure pixel.
  • The method includes the following steps: determining the brightness level of the ambient brightness, where the brightness levels include a low brightness level, a medium brightness level, and a high brightness level, ordered from darkest to brightest; if the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjusting the arrangement positions of the short exposure pixels, middle exposure pixels, and/or long exposure pixels in the pixel unit array so that at least two middle exposure pixels are arranged adjacently, and using the at least two adjacently arranged middle exposure pixels as a first pixel unit; and performing imaging according to the output pixel values of the at least two middle exposure pixels in the first pixel unit.
  • The control method of the embodiment of the present disclosure determines the brightness level of the ambient brightness and, if the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjusts the pixel arrangement and performs imaging as described above.
  • An embodiment of another aspect of the present disclosure provides a control device applied to an imaging device. The imaging device includes a pixel unit array composed of multiple exposure pixels, each exposure pixel being a short exposure pixel, a middle exposure pixel, or a long exposure pixel, where the exposure time of the long exposure pixel is greater than that of the middle exposure pixel, and the exposure time of the middle exposure pixel is greater than that of the short exposure pixel. The device includes:
  • a determining module for determining the brightness level of the ambient brightness, where the brightness levels include a low brightness level, a medium brightness level, and a high brightness level, ordered from darkest to brightest;
  • an adjustment module for adjusting, if the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, the arrangement positions of the short exposure pixels, middle exposure pixels, and/or long exposure pixels in the pixel unit array so that at least two middle exposure pixels are arranged adjacently, and for using the at least two adjacently arranged middle exposure pixels as a first pixel unit; and
  • an imaging module configured to perform imaging according to the output pixel values of the at least two middle exposure pixels in the first pixel unit.
  • The control device of the embodiment of the present disclosure determines the brightness level of the ambient brightness. If the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, it adjusts the arrangement positions of the short exposure pixels, middle exposure pixels, and/or long exposure pixels in the pixel unit array so that at least two middle exposure pixels are arranged adjacently, uses the at least two adjacently arranged middle exposure pixels as a first pixel unit, and then performs imaging according to the output pixel values of the at least two middle exposure pixels in the first pixel unit. The arrangement positions can thus be adjusted automatically according to the brightness level of the shooting environment, so that imaging through the output pixel values of at least two adjacently arranged middle exposure pixels retains more effective information in the captured image and improves its brightness, thereby improving the imaging effect, the imaging quality, and the user's shooting experience.
  • An embodiment of another aspect of the present disclosure provides an electronic device including an imaging device, the imaging device including a pixel unit array composed of multiple exposure pixels, each exposure pixel being a short exposure pixel, a middle exposure pixel, or a long exposure pixel, where the exposure time of the long exposure pixel is greater than that of the middle exposure pixel, and the exposure time of the middle exposure pixel is greater than that of the short exposure pixel. The electronic device further includes a memory, a processor, and a computer program stored on the memory and executable on the processor; when the processor executes the program, it implements the control method proposed by the foregoing embodiments of the present disclosure.
  • An embodiment of another aspect of the present disclosure provides a non-transitory computer-readable storage medium on which a computer program is stored, which is characterized in that when the program is executed by a processor, the control method as proposed in the foregoing embodiment of the present disclosure is implemented.
  • FIG. 1 is a schematic flowchart of a control method according to Embodiment 1 of the present disclosure
  • FIG. 2 is a first schematic structural diagram of a portion of a pixel unit array of an imaging device in an embodiment of the present disclosure
  • FIG. 3 is a schematic diagram of a grayscale histogram corresponding to a backlight scene in an embodiment of the present disclosure
  • FIG. 4 is a schematic flowchart of a control method provided in Embodiment 2 of the present disclosure.
  • FIG. 5 is a second schematic structural diagram of a portion of a pixel unit array of an imaging device in an embodiment of the present disclosure
  • FIG. 6 is a schematic flowchart of a control method according to a third embodiment of the present disclosure.
  • FIG. 7 is a schematic structural diagram of a photosensitive pixel unit in an embodiment of the present disclosure.
  • FIG. 8 is a schematic flowchart of a control method according to a fourth embodiment of the present disclosure.
  • FIG. 9 is a schematic structural diagram of a control device according to a fifth embodiment of the present disclosure.
  • FIG. 10 is a schematic structural diagram of a control device according to a sixth embodiment of the present disclosure.
  • FIG. 11 is a schematic block diagram of an electronic device according to some embodiments of the present disclosure.
  • FIG. 12 is a schematic block diagram of an image processing circuit according to some embodiments of the present disclosure.
  • Aiming at the technical problem of poor image quality in the prior art, the present disclosure provides a control method that determines the brightness level of the ambient brightness and, if the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjusts the pixel arrangement for imaging as described below.
  • FIG. 1 is a schematic flowchart of a control method according to a first embodiment of the present disclosure.
  • the control method of the embodiment of the present disclosure is applied to an imaging device.
  • the imaging device includes a pixel unit array composed of multiple exposure pixels, and each exposure pixel is a short exposure pixel, a medium exposure pixel, or a long exposure pixel.
  • A long exposure pixel is a photosensitive pixel whose exposure time is the long exposure time, a middle exposure pixel is a photosensitive pixel whose exposure time is the middle exposure time, and a short exposure pixel is a photosensitive pixel whose exposure time is the short exposure time, where long exposure time > middle exposure time > short exposure time. That is, the long exposure time of the long exposure pixel is greater than the middle exposure time of the middle exposure pixel, and the middle exposure time of the middle exposure pixel is greater than the short exposure time of the short exposure pixel.
  • In this embodiment, the long exposure pixels, the middle exposure pixels, and the short exposure pixels are exposed synchronously. Synchronous exposure means that the exposure windows of the middle exposure pixels and the short exposure pixels fall within the exposure window of the long exposure pixels. For example, the long exposure pixel can be controlled to start its exposure first, after which the exposure of the middle exposure pixel and the short exposure pixel is started; the exposure cutoff times of the middle exposure pixel and the short exposure pixel must be the same as, or earlier than, the exposure cutoff time of the long exposure pixel.
  • Alternatively, the long exposure pixel, the middle exposure pixel, and the short exposure pixel can be controlled to start exposure at the same time, i.e., with identical exposure start times. In this way, there is no need to control the pixel unit array to perform the long, middle, and short exposures in sequence, which reduces the shooting time of the image.
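  • The synchronous-exposure constraint above can be sketched as a simple check (the function name and the millisecond durations are illustrative assumptions, not values from the disclosure):

```python
# Synchronous exposure: the exposure windows of the middle (M) and
# short (S) pixels must lie entirely within the window of the long (L)
# exposure pixels; cutoff times may coincide with the long cutoff.
# Windows are (start, end) tuples in arbitrary time units.

def windows_synchronous(long_win, mid_win, short_win):
    l_start, l_end = long_win
    return all(l_start <= s and e <= l_end for s, e in (mid_win, short_win))

# All three exposures start together (the simpler scheme in the text):
# e.g. L for 40 ms, M for 20 ms, S for 5 ms.
print(windows_synchronous((0, 40), (0, 20), (0, 5)))   # True
# An M window ending after the L cutoff violates the constraint.
print(windows_synchronous((0, 40), (10, 45), (0, 5)))  # False
```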
  • control method includes the following steps:
  • Step 101: Determine the brightness level of the ambient brightness; the brightness levels include a low brightness level, a medium brightness level, and a high brightness level, ordered from darkest to brightest.
  • In this embodiment, the ambient brightness may be divided in advance into three brightness levels: a low brightness level, a medium brightness level, and a high brightness level. The levels may be set in advance by a built-in program of the electronic device or set by the user; there is no restriction on this.
  • To measure the ambient brightness, an independent light-metering device may be used, the sensitivity (ISO) value automatically adjusted by the camera may be read and the ambient brightness determined from it, or the pixel unit array itself may be controlled to measure the ambient brightness value; this is likewise not limited. After the ambient brightness is determined, the brightness level can be determined from it.
  • The ISO value indicates the sensitivity of the camera; commonly used ISO values are 50, 100, 200, 400, 1000, and so on. Since the camera automatically adjusts the ISO value according to the ambient brightness, in this embodiment the ambient brightness can be deduced from the ISO value: under sufficient light the ISO value is typically 50 or 100, while under insufficient light it may be 400 or higher.
  • The ranges assigned to each brightness level can be set differently. For example, when the electronic device is indoors at night the ISO value is usually between 200 and 500, and when it is outdoors during the day the ISO value is generally below 200. Therefore, the size and interval of each brightness level can be set according to actual needs and the specific shooting scene.
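  • An illustrative mapping from the camera's auto-adjusted ISO value to the three brightness levels could look as follows; the cut-offs of 200 and 500 follow the example figures above but are assumptions that would be tuned per device and shooting scene:

```python
# Classify ambient brightness from the ISO value the camera chose.
# Lower ISO means the scene is brighter.

def brightness_level(iso):
    if iso < 200:        # bright scene, e.g. outdoors during the day
        return "high"
    elif iso <= 500:     # e.g. indoors at night
        return "medium"
    else:                # very dim scene: sensitivity pushed above 500
        return "low"

print(brightness_level(100))   # high
print(brightness_level(400))   # medium
print(brightness_level(1000))  # low
```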
  • Step 102: If the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjust the arrangement positions of the short exposure pixels, the middle exposure pixels, and/or the long exposure pixels in the pixel unit array so that at least two middle exposure pixels are arranged adjacently, and use the at least two adjacently arranged middle exposure pixels as the first pixel unit.
  • When the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, the shooting environment is very bright or very dark. Under such extreme ambient brightness, the noise of the output image is high, or the pixel values output by the long exposure pixels or the short exposure pixels may overflow, causing serious loss of image detail.
  • In the red, green, and blue (RGB) three-color histogram, the detail distribution of the three channels at different brightness levels can be displayed intuitively, with underexposed and overexposed parts gathering at the left and right ends of the histogram respectively. Areas beyond the upper tolerance do not become any brighter; they simply lose all detail and display only pure white (255, 255, 255). Likewise, underexposed areas beyond the lower tolerance do not become any darker; they lose all detail and display only pure black (0, 0, 0).
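  • The clipping behaviour described above can be illustrated with a minimal sketch: sensor values past the tolerance range saturate, so overexposed pixels collapse to pure white and underexposed ones to pure black, losing all detail (the function name is illustrative):

```python
# Clamp an RGB triple to the 8-bit range [0, 255]: values past either
# end of the tolerance saturate and all detail in them is lost.

def clip_rgb(rgb):
    return tuple(max(0, min(255, c)) for c in rgb)

print(clip_rgb((300, 280, 260)))  # (255, 255, 255): overexposed, detail lost
print(clip_rgb((-20, -5, -1)))    # (0, 0, 0): underexposed, detail lost
print(clip_rgb((120, 64, 200)))   # within tolerance: unchanged
```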
  • Therefore, in this embodiment, the arrangement positions of the short exposure pixels, the middle exposure pixels, and/or the long exposure pixels in the pixel unit array can be adjusted so that at least two middle exposure pixels are arranged adjacently, and the at least two adjacently arranged middle exposure pixels are used as the first pixel unit. Specifically, the position of each middle exposure pixel can be exchanged with that of an adjacent short exposure pixel or long exposure pixel so that at least two middle exposure pixels become adjacent; these adjacently arranged middle exposure pixels then serve as a first pixel unit.
  • FIG. 2 is a first schematic structural diagram of a part of a pixel unit array of an imaging device in an embodiment of the present disclosure.
  • the original pixel unit array includes 16 exposure pixels, which are 4 long exposure pixels (L), 8 middle exposure pixels (M), and 4 short exposure pixels (S).
  • L: long exposure pixel; M: middle exposure pixel; S: short exposure pixel
  • A middle exposure pixel (M) can be exchanged with an adjacent short exposure pixel (S), or with an adjacent long exposure pixel (L), so that the eight adjacently arranged middle exposure pixels (M) included in region 21 can be used as the first pixel unit; alternatively, two, four, or six adjacently arranged middle exposure pixels (M) can serve as a first pixel unit.
  • As a possible implementation, the number of middle exposure pixels (M) contained in each first pixel unit may be adjusted according to the brightness of the shooting environment: the lower the brightness, the greater the number of middle exposure pixels (M) in each first pixel unit. Normally, fixing each first pixel unit at 4 middle exposure pixels (M) meets the needs of most shooting scenes.
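  • The swapping step can be sketched on a toy 4x4 array; the starting pattern and swap positions below are illustrative, not the exact FIG. 2 layout:

```python
# Swap middle-exposure pixels (M) with adjacent short (S) or long (L)
# exposure pixels so that M pixels end up adjacent and can form a
# first pixel unit.

def swap(array, a, b):
    (r1, c1), (r2, c2) = a, b
    array[r1][c1], array[r2][c2] = array[r2][c2], array[r1][c1]

array = [list("LMLM"),
         list("MSMS"),
         list("LMLM"),
         list("MSMS")]

swap(array, (1, 1), (1, 0))   # S at (1,1) <-> adjacent M at (1,0)
swap(array, (2, 2), (2, 3))   # L at (2,2) <-> adjacent M at (2,3)
# The centre 2x2 block is now four adjacent middle-exposure pixels,
# which can serve as a first pixel unit.
print([array[r][c] for r in (1, 2) for c in (1, 2)])  # ['M', 'M', 'M', 'M']
```

Note that swapping only moves pixels, so the counts of L, M, and S pixels in the array are preserved.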
  • Step 103: Perform imaging according to the output pixel values of the at least two middle exposure pixels in the first pixel unit.
  • In this embodiment, imaging can be performed according to the output pixel values of the at least two middle exposure pixels in the first pixel unit. Referring to FIG. 2, the pixel values output by the eight adjacently arranged middle exposure pixels in region 21 can be used for imaging. It can be understood that imaging with the output pixel values of the middle exposure pixels avoids pixel-value overflow, thereby retaining more image detail and improving the imaging effect and imaging quality.
  • The control method of the embodiment of the present disclosure thus determines the brightness level of the ambient brightness and, when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, automatically adjusts the arrangement positions of the short exposure pixels, middle exposure pixels, and/or long exposure pixels in the pixel unit array, so that imaging through the output pixel values of at least two adjacently arranged middle exposure pixels retains more effective information in the captured image and improves its brightness.
  • As a possible implementation, whether the current shooting environment is a backlit scene can be determined from the histogram of the captured preview image. Specifically, a grayscale histogram may be generated from the grayscale values corresponding to the ambient brightness values measured by the pixel unit array, and whether the current shooting environment is a backlit scene is then determined from the proportion of photosensitive pixels falling in each grayscale range.
  • For example, when the ratio grayRatio of the photosensitive pixels whose grayscale values fall in a certain low grayscale range to all photosensitive pixels in the pixel unit array is greater than a first threshold (for example, 0.135), and the ratio of the photosensitive pixels whose grayscale values fall in the grayscale range [200, 256) to all photosensitive pixels is greater than a second threshold (for example, 0.0899), it is determined that the current shooting environment is a backlit scene.
  • Similarly, when the ratio of the photosensitive pixels in the low grayscale range is greater than a third threshold (for example, 0.3) and the ratio of the photosensitive pixels in the grayscale range [200, 256) is greater than a fourth threshold (for example, 0.003), it is determined that the current shooting environment is a backlit scene; and when the former ratio is greater than a fifth threshold (for example, 0.005) and the latter is greater than a sixth threshold (for example, 0.25), it is likewise determined that the current shooting environment is a backlit scene.
  • a grayscale histogram corresponding to a backlight scene may be shown in FIG. 3.
  • In a backlit scene, the ambient brightness values measured by the photosensitive pixels in the pixel unit array show a large brightness difference. Therefore, as another possible implementation, the brightness value of the imaging object and the background brightness value can be determined from the ambient brightness values measured by the pixel unit array, and it can then be determined whether the difference between them is greater than a preset threshold. If the difference between the brightness value of the imaging object and the background brightness value is greater than the preset threshold, the current shooting environment is a backlit scene; if the difference is less than or equal to the preset threshold, it is a non-backlit scene.
  • The preset threshold may be set in advance in a built-in program of the electronic device or set by the user; this is not limited. An imaging object is an object that the electronic device needs to photograph, such as a person (or a human face), an animal, an object, or a scene.
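  • The second backlight test above can be sketched as follows; the threshold of 80 grey levels is an assumed placeholder, since the disclosure leaves the preset threshold device- or user-configurable:

```python
# Compare the mean brightness of the imaging-object region with that of
# the background; a large difference indicates a backlit scene.

def is_backlit(object_lumas, background_lumas, threshold=80):
    obj = sum(object_lumas) / len(object_lumas)
    bg = sum(background_lumas) / len(background_lumas)
    return abs(obj - bg) > threshold

# Dark subject against a bright, sky-like background -> backlit.
print(is_backlit([40, 55, 60], [230, 240, 235]))  # True
# Subject and background of similar brightness -> non-backlit.
print(is_backlit([120, 130], [140, 150]))         # False
```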
  • In a possible implementation, step 102 may specifically include the following sub-steps:
  • Step 201: Control the at least two middle exposure pixels in the first pixel unit to output pixel values.
  • Specifically, each middle exposure pixel in the first pixel unit can be controlled to be exposed synchronously with the same exposure time, so that the exposure cutoff times of the middle exposure pixels are also the same. After the exposure ends, the middle exposure pixels in each first pixel unit output their corresponding pixel values; for example, referring to FIG. 2, the first pixel unit outputs 8 pixel values.
  • Step 202 Generate a first composite pixel value according to the pixel value output by the same first pixel unit.
  • Since the pixel unit array includes multiple exposure pixels, after the arrangement positions of the short exposure pixels, middle exposure pixels, and/or long exposure pixels are adjusted so that at least two middle exposure pixels are adjacent, there may be a plurality of first pixel units.
  • As a possible implementation, the pixel unit array may be composed of a plurality of photosensitive pixel units, each photosensitive pixel unit including at least two exposure pixels, of which at least one is a middle exposure pixel.
  • FIG. 5 is a second schematic diagram of a partial structure of a pixel unit array of an imaging device in an embodiment of the present disclosure.
  • the imaging device 30 includes a pixel unit array 31 and a filter unit array 32 provided on the pixel unit array 31.
  • The pixel unit array 31 includes a plurality of photosensitive pixel units 311, each photosensitive pixel unit 311 including at least two exposure pixels 3111, of which at least one is a middle exposure pixel.
  • each photosensitive pixel unit 311 includes four exposure pixels 3111.
  • the four exposure pixels may be one long exposure pixel, two medium exposure pixels, and one short exposure pixel.
  • the number of long-exposure pixels, middle-exposure pixels, and short-exposure pixels in each photosensitive pixel unit 311 may also be other values, which is not limited.
  • the filter unit array 32 includes a plurality of filter units 322 corresponding to the plurality of photosensitive pixel units 311, and each filter unit 322 covers the corresponding photosensitive pixel unit 311.
  • the pixel unit array 31 may be a Bayer array.
  • Each exposure pixel included in the same photosensitive pixel unit is covered by the same color filter, and the at least two middle exposure pixels belonging to the same first pixel unit are covered by the same color filter, i.e., belong to the same photosensitive pixel unit. That is, in the present disclosure, the middle exposure pixels in the same first pixel unit are located in the same photosensitive pixel unit, so each photosensitive pixel unit in the pixel unit array contains a first pixel unit.
  • As a possible implementation, the sum of the pixel values output by all the middle exposure pixels included in a first pixel unit may be used as the first composite pixel value. Alternatively, the average of the pixel values output by all the middle exposure pixels in the first pixel unit may be used as the first composite pixel value.
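  • The two combination rules above, written out as a minimal sketch (the function names are illustrative):

```python
# Combine the outputs of the middle-exposure pixels in one first pixel
# unit, either by summing them or by averaging them.

def first_composite_sum(mid_values):
    return sum(mid_values)

def first_composite_avg(mid_values):
    return sum(mid_values) / len(mid_values)

# A first pixel unit of four middle-exposure pixel outputs
# (FIG. 2's region 21 would contribute eight):
values = [100, 104, 98, 102]
print(first_composite_sum(values))  # 404
print(first_composite_avg(values))  # 101.0
```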
  • Step 203: Perform imaging according to the first composite pixel values.
  • In this embodiment, the first composite pixel value corresponding to each first pixel unit in the pixel unit array can be obtained, and the target image is then generated from the multiple first composite pixel values by interpolation.
  • In a medium-brightness environment, the arrangement positions of the short exposure pixels, the middle exposure pixels, and/or the long exposure pixels in the pixel unit array can instead be adjusted so that the middle exposure pixels are arranged at intervals, and the middle exposure pixels together with the adjacent short exposure pixels and/or long exposure pixels are used for imaging. The long exposure pixels can correct the dark areas in the image and the short exposure pixels can correct the bright areas, improving the imaging effect and imaging quality.
  • FIG. 6 is a schematic flowchart of a control method provided in Embodiment 3 of the present disclosure.
  • control method may further include the following steps:
  • Step 301: If the brightness level of the shooting environment belongs to the medium brightness level, adjust the arrangement positions of the short exposure pixels, the middle exposure pixels, and/or the long exposure pixels in the pixel unit array so that the middle exposure pixels are arranged at intervals, and use at least two middle exposure pixels covered by the same color filter together with the short exposure pixels and/or long exposure pixels adjacent to them as the second pixel unit.
  • In this embodiment, when the brightness level of the shooting environment belongs to the medium brightness level, the arrangement positions of the short exposure pixels, the middle exposure pixels, and/or the long exposure pixels in the pixel unit array can be adjusted so that the middle exposure pixels are arranged at intervals. Specifically, each middle exposure pixel may be swapped with an adjacent short exposure pixel and/or long exposure pixel so that the middle exposure pixels are spaced apart, and at least two middle exposure pixels covered by the same color filter, together with a short exposure pixel or a long exposure pixel adjacent to them, form the second pixel unit.
  • The second pixel unit may consist of at least two middle exposure pixels covered by the same color filter and the short exposure pixels adjacent to them; or of the at least two same-color-filter middle exposure pixels and the long exposure pixels adjacent to them; or of the at least two same-color-filter middle exposure pixels together with both the adjacent short exposure pixels and the adjacent long exposure pixels. Since each exposure pixel contained in the same photosensitive pixel unit is covered by the same color filter, each photosensitive pixel unit in the pixel unit array contains a second pixel unit.
  • Referring to FIG. 5, one photosensitive pixel unit includes 4 exposure pixels: 1 long exposure pixel (L), 2 middle exposure pixels (M), and 1 short exposure pixel (S). A middle exposure pixel (M) may be exchanged with an adjacent short exposure pixel (S), or with an adjacent long exposure pixel (L), so that the middle exposure pixels (M) are arranged at intervals.
  • The two middle exposure pixels (M) and the short exposure pixel (S) in the photosensitive pixel unit can then be used as the second pixel unit; or the two middle exposure pixels (M) and the long exposure pixel (L) can be used as the second pixel unit; or the two middle exposure pixels (M), the long exposure pixel (L), and the short exposure pixel (S) can together be used as the second pixel unit.
  • Step 302 Perform imaging according to the pixel value output by each exposed pixel in the second pixel unit.
  • imaging can be performed according to the pixel value output by each exposure pixel in the second pixel unit. Therefore, the long exposure pixels can correct the dark areas in the image, or the short exposure pixels can correct the light areas in the image, improving the imaging effect and imaging quality.
  • step 302 may specifically include the following sub-steps:
  • Step 401 Control the output pixel value of each exposed pixel in the second pixel unit.
  • In this embodiment, each exposure pixel in the second pixel unit can be controlled to output its pixel value.
  • In one case, the second pixel unit may be controlled to output multiple pixel values obtained at different exposure times. For example, the middle exposure pixels and the long exposure pixel in the second pixel unit may be controlled to be exposed synchronously; or the middle exposure pixels and the short exposure pixel may be exposed synchronously; or the middle exposure pixels, the long exposure pixel, and the short exposure pixel may all be exposed synchronously. Here the exposure time of the long exposure pixel is the initial long exposure time, that of the middle exposure pixels is the initial medium exposure time, and that of the short exposure pixel is the initial short exposure time, all of which are preset. After the exposure ends, the second pixel unit outputs multiple pixel values corresponding to the different exposure times.
  • In another case, the second pixel unit may be controlled to output multiple pixel values obtained through exposure with the same exposure time. For example, the middle exposure pixels and the long exposure pixel in the second pixel unit can be controlled to be exposed synchronously, or the middle exposure pixels and the short exposure pixel, or the middle exposure pixels, the long exposure pixel, and the short exposure pixel together. In this case the exposure time of every exposure pixel is the same, so the exposure cutoff times of the middle, long, and short exposure pixels are also the same, and after the exposure ends the second pixel unit outputs multiple pixel values obtained with the same exposure time.
  • Step 402 Generate a second synthesized pixel value according to the pixel value output by the same second pixel unit.
  • each photosensitive pixel unit in the pixel unit array has a second pixel unit
  • a second composite pixel value can be generated according to a pixel value output by the same second pixel unit.
  • the second pixel unit after controlling the second pixel unit to output multiple pixel values respectively under different exposure times, for the same second pixel unit, it can be calculated based on the pixel values in the second pixel unit that have the same exposure time. Get the second synthesized pixel value.
  • each second pixel unit when each second pixel unit includes one long exposure pixel and two middle exposure pixels, the pixel value output by the only long exposure pixel is the second composite pixel value of the long exposure, and two The sum of the pixel values output by the middle exposure pixels is the second composite pixel value of the middle exposure; when each second pixel unit includes 2 middle exposure pixels and 1 short exposure pixel, the pixels output by the 2 middle exposure pixels The sum of the values is the second composite pixel value for medium exposure, and the pixel value output by one short exposure pixel is the second composite pixel value for short exposure. In this way, the second composite pixel value of multiple exposures, the second composite pixel value of multiple long exposures, and the second composite pixel value of multiple short exposures can be obtained for the entire pixel unit array.
  • multiple pixel values output by the second pixel unit may be calculated An average value is obtained to obtain a second synthesized pixel value, wherein each second pixel unit corresponds to a second synthesized pixel value.
  • each second pixel unit includes one long-exposure pixel and two middle-exposure pixels
  • the value of one pixel output of one long-exposure pixel and the two-pixel value of two middle-exposure pixels output are respectively : R1, R2, and R3,
  • the second composite pixel value of the second pixel unit is: (R1 + R2 + R3) / 3.
  • Step 403 Perform imaging according to the second synthesized pixel value.
  • imaging can be performed according to the second synthesized pixel value.
  • the second pixel unit when the second pixel unit is controlled to output multiple pixel values respectively at different exposure times, and for the same second pixel unit, it is calculated based on the pixel values in the second pixel unit that have the same exposure time.
  • multiple middle-exposed second composite pixel values and multiple long-exposure second composite pixel values and / or multiple short-exposure second composite pixel values can be obtained for the entire pixel unit array, and then , And then calculate a mid-exposure sub-image based on a plurality of intermediate-exposed second composite pixel values, and calculate a long-exposure sub-image based on a plurality of long-exposed second composite pixel values, and / or a plurality of short-exposed second
  • the short-exposure sub-image is calculated by interpolating the composite pixel values.
  • the middle exposure sub image and the long exposure sub image and / or the short exposure sub image are processed by fusion to obtain a high dynamic range imaging image.
  • the long exposure sub image, the middle exposure sub image, and the short exposure sub image are not in the traditional sense.
  • the three frames of images are image parts formed by corresponding regions of long, short, and medium exposure pixels in the same frame of image.
  • the pixel value of the long exposure pixel and / or the pixel value of the short exposure pixel may be superimposed on the pixel value of the long exposure pixel based on the pixel value of the long exposure pixel.
  • different weights may be assigned to pixel values at different exposure times. After the pixel values corresponding to each exposure time are multiplied by the weight values, the pixel values after the multiplication are weighted. Add a second composite pixel value as a second pixel unit. Subsequently, since the gray level of each second composite pixel value calculated according to the pixel values of different exposure times will change, it is necessary to gray each second composite pixel value after obtaining the second composite pixel value.
  • an imaging image can be obtained by performing interpolation calculation based on a plurality of second synthesized pixel values obtained after compression.
  • the dark part of the imaged image has been compensated by the pixel value output by the long exposure pixel, or the bright part has been suppressed by the pixel value output by the short exposure pixel, so there is no over-exposed area or under-exposed area in the imaged image, which has a higher Dynamic range and better imaging results.
  • a plurality of second synthesized pixel values in the entire pixel unit array may be determined, and then interpolation calculation may be performed according to the plurality of second synthesized pixel values to obtain an imaging image.
  • the present disclosure also proposes a control device.
  • FIG. 9 is a schematic structural diagram of a control device according to a fifth embodiment of the present disclosure.
  • the control device 100 is applied to an imaging device.
  • the imaging device includes a pixel unit array composed of multiple exposure pixels. Each exposure pixel is a short exposure pixel, a medium exposure pixel, or a long exposure pixel. Among them, the long exposure pixel
  • the control device 100 includes a determination module 101, an adjustment module 102, and an imaging module 103.
  • the exposure duration of the exposure time is greater than the exposure duration of the middle exposure pixels and the exposure duration of the middle exposure pixels is greater than the exposure duration of the short exposure pixels. among them,
  • the determining module 101 is configured to determine a brightness level of ambient brightness; the brightness level includes a low brightness level, a medium brightness level, and a high brightness level in which the brightness is arranged from small to large.
  • the adjustment module 102 is configured to adjust the arrangement position of the short exposure pixels, the middle exposure pixels, and / or the long exposure pixels in the pixel unit array when the brightness level of the shooting environment belongs to a high brightness level or a low brightness level, so that at least two The middle exposure pixels are arranged adjacently, and at least two middle exposure pixels arranged adjacently are used as the first pixel unit.
  • the imaging module 103 is configured to perform imaging according to output pixel values of at least two exposed pixels in the first pixel unit.
  • the imaging module 103 is specifically configured to: control output pixel values of at least two exposed pixels in the first pixel unit; and generate a first composite pixel value according to the pixel value output by the same first pixel unit; Imaging is performed according to the first synthesized pixel value.
  • the adjustment module 102 is further configured to: after determining the brightness level of the ambient brightness, if the brightness of the shooting environment is equal to the medium brightness level, adjust the short exposure pixels, the medium exposure pixels, and / or the long exposure
  • the pixels are arranged in the pixel unit array such that the exposure pixels are arranged at intervals. At least two exposure pixels covered by the same color filter and short adjacent pixels of the at least two exposure pixels covered by the same color filter are arranged.
  • the exposure pixels and / or long exposure pixels serve as the second pixel unit.
  • the imaging module 103 is further configured to perform imaging according to a pixel value output by each exposed pixel in the second pixel unit.
  • the imaging module 103 is specifically configured to: control the output pixel value of each exposure pixel in the second pixel unit; generate a second composite pixel value according to the pixel value output by the same second pixel unit; and according to the second The pixel values are synthesized for imaging.
  • control device 100 may further include:
  • the first backlight scene determination module 104 is configured to determine, before determining the brightness level of the ambient brightness, according to the histogram of the captured preview image, that the current shooting environment belongs to the backlight scene.
  • the second backlight scene determination module 105 is configured to determine the brightness value of the imaging object and the background brightness value according to the environment brightness value measured by the pixel unit array before determining the brightness level of the environment brightness, and according to the The brightness value of the imaging object and the brightness value of the background determine that the current shooting environment belongs to a backlit scene.
  • At least two exposure pixels belonging to the same first pixel unit are covered with the same color filter.
  • the adjustment module 102 is specifically configured to: in the pixel unit array, exchange positions between each of the exposed pixels and an adjacent short exposure pixel or long exposure pixel.
  • the control device of the embodiment of the present disclosure determines the brightness level of the ambient brightness.
  • the brightness level of the shooting environment belongs to a high brightness level or a low brightness level, it adjusts the short exposure pixels, middle exposure pixels, and / or long exposure pixels in the pixel unit array.
  • the middle so that at least two middle exposure pixels are arranged next to each other, with at least two middle exposure pixels arranged next to each other as the first pixel unit, and then according to at least two middle exposure pixels in the first pixel unit Output pixel values for imaging. Therefore, the arrangement position of the short exposure pixels, middle exposure pixels, and / or long exposure pixels in the pixel unit array can be automatically adjusted according to the brightness level of the shooting environment, so as to output through at least two middle exposure pixels arranged adjacently.
  • Pixel values for imaging can retain more effective information in the captured image, improve the brightness of the captured image, thereby improving the imaging effect and imaging quality, and improving the user's shooting experience.
  • the present disclosure also proposes an electronic device including the imaging device, the imaging device including a pixel unit array composed of multiple exposure pixels, each exposure pixel being a short exposure pixel, a medium exposure pixel, or a long exposure A pixel, wherein the exposure time of the long exposure pixel is greater than the exposure time of the middle exposure pixel, and the exposure time of the middle exposure pixel is greater than the exposure time of the short exposure pixel, the electronic device further includes a memory, a processing unit, And a computer program stored on a memory and executable on a processor, and when the processor executes the program, the control method proposed by the foregoing embodiment of the present disclosure is implemented.
  • the present disclosure also proposes a non-transitory computer-readable storage medium on which a computer program is stored, which is characterized in that when the program is executed by a processor, the control method as proposed in the foregoing embodiment of the present disclosure is implemented .
  • the present disclosure further provides an electronic device 200.
  • the electronic device 200 includes a memory 50 and a processor 60.
  • the memory 50 stores computer-readable instructions.
  • the processor 60 is caused to execute the control method of any one of the foregoing embodiments.
  • FIG. 11 is a schematic diagram of the internal structure of the electronic device 200 in one embodiment.
  • the electronic device 200 includes a processor 60, a memory 50 (for example, a non-volatile storage medium), an internal memory 82, a display screen 83, and an input device 84 connected through a system bus 81.
  • the memory 50 of the electronic device 200 stores an operating system and computer-readable instructions.
  • the computer-readable instructions can be executed by the processor 60 to implement the control method of the embodiment of the present disclosure.
  • the processor 60 is used to provide computing and control capabilities to support the operation of the entire electronic device 200.
  • the internal memory 50 of the electronic device 200 provides an environment for execution of computer-readable instructions in the memory 52.
  • the display screen 83 of the electronic device 200 may be a liquid crystal display or an electronic ink display.
  • the input device 84 may be a touch layer covered on the display screen 83, or may be a button, a trackball, or a touch button provided on the housing of the electronic device 200.
  • Board which can also be an external keyboard, trackpad, or mouse.
  • the electronic device 200 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (for example, a smart bracelet, a smart watch, a smart helmet, or smart glasses).
  • FIG. 11 is only a schematic diagram of a part of the structure related to the solution of the present disclosure, and does not constitute a limitation on the electronic device 200 to which the solution of the present disclosure is applied.
  • the specific electronic device 200 may include more or fewer components than shown in the figure, or some components may be combined, or have different component arrangements.
  • the electronic device 200 includes an image processing circuit 90.
  • the image processing circuit 90 may be implemented by using hardware and / or software components, including various types of ISP (Image Signal Processing) pipelines. Processing unit.
  • FIG. 12 is a schematic diagram of an image processing circuit 90 in one embodiment. As shown in FIG. 12, for convenience of explanation, only aspects of the image processing technology related to the embodiments of the present disclosure are shown.
  • the image processing circuit 90 includes an ISP processor 91 (the ISP processor 91 may be the processor 60) and a control logic 92.
  • the image data captured by the camera 93 is first processed by the ISP processor 91.
  • the ISP processor 91 analyzes the image data to capture image statistical information that can be used to determine one or more control parameters of the camera 93.
  • the camera 93 may include one or more lenses 932 and an image sensor 934.
  • the image sensor 934 may include a color filter array (such as a Bayer filter). The image sensor 934 may obtain light intensity and wavelength information captured by each imaging pixel, and provide a set of raw image data that can be processed by the ISP processor 91.
  • the sensor 94 (such as a gyroscope) may provide parameters (such as image stabilization parameters) of the acquired image processing to the ISP processor 91 based on the interface type of the sensor 94.
  • the sensor 94 interface may be a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the foregoing interfaces.
  • the image sensor 934 may also send the original image data to the sensor 94.
  • the sensor 94 may provide the original image data to the ISP processor 91 based on the interface type of the sensor 94, or the sensor 94 stores the original image data into the image memory 95.
  • the ISP processor 91 processes the original image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 91 may perform one or more image processing operations on the original image data and collect statistical information about the image data. The image processing operations may be performed with the same or different bit depth accuracy.
  • the ISP processor 91 may also receive image data from the image memory 95.
  • the sensor 94 interface sends the original image data to the image memory 95, and the original image data in the image memory 95 is then provided to the ISP processor 91 for processing.
  • the image memory 95 may be an independent dedicated memory in the memory 50, a part of the memory 50, a storage device, or an electronic device, and may include a DMA (Direct Memory Access) feature.
  • DMA Direct Memory Access
  • the ISP processor 91 may perform one or more image processing operations, such as time-domain filtering.
  • the processed image data may be sent to the image memory 95 for further processing before being displayed.
  • the ISP processor 91 receives processing data from the image memory 95, and performs processing on the image data in the original domain and in the RGB and YCbCr color spaces.
  • the image data processed by the ISP processor 91 may be output to a display 97 (the display 97 may include a display screen 83) for viewing by a user and / or further processing by a graphics engine or a GPU (Graphics Processing Unit).
  • the output of the ISP processor 91 can also be sent to the image memory 95, and the display 97 can read image data from the image memory 95.
  • the image memory 95 may be configured to implement one or more frame buffers.
  • the output of the ISP processor 91 may be sent to an encoder / decoder 96 to encode / decode image data.
  • the encoded image data can be saved and decompressed before being displayed on the display 97 device.
  • the encoder / decoder 96 may be implemented by a CPU or a GPU or a coprocessor.
  • the statistical data determined by the ISP processor 91 may be sent to the control logic unit 92.
  • the statistical data may include image sensor 934 statistical information such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, and lens 932 shading correction.
  • the control logic 92 may include a processing element and / or a microcontroller that executes one or more routines (such as firmware). The one or more routines may determine the control parameters of the camera 93 and the ISP processor according to the received statistical data. 91 control parameters.
  • control parameters of the camera 93 may include sensor 94 control parameters (such as gain, integration time for exposure control, anti-shake parameters, etc.), camera flash control parameters, lens 932 control parameters (such as focus distance for focusing or zooming), or these parameters The combination.
  • the ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (eg, during RGB processing), and lens 932 shading correction parameters.
  • the following are the steps for implementing the control method using the processor 60 in FIG. 11 or the image processing circuit 90 (specifically, the ISP processor 91) in FIG. 12:
  • the brightness level includes a low brightness level, a medium brightness level, and a high brightness level in which the brightness is arranged from small to large;
  • the brightness level of the shooting environment belongs to a high brightness level or a low brightness level, adjust the arrangement position of the short exposure pixels, middle exposure pixels, and / or long exposure pixels in the pixel unit array so that at least two Middle exposure pixels are arranged adjacently, and at least two middle exposure pixels arranged adjacently are used as a first pixel unit;
  • Imaging is performed according to output pixel values of at least two middle-exposed pixels in the first pixel unit.
  • Imaging is performed according to the first synthesized pixel value.
  • first and second are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, the features defined as “first” and “second” may explicitly or implicitly include at least one of the features. In the description of the present disclosure, the meaning of "a plurality” is at least two, for example, two, three, etc., unless it is specifically and specifically defined otherwise.
  • any process or method description in a flowchart or otherwise described herein can be understood as representing a module, fragment, or portion of code that includes one or more executable instructions for implementing steps of a custom logic function or process
  • the scope of the preferred embodiments of the present disclosure includes additional implementations in which functions may be performed out of the order shown or discussed, including performing functions in a substantially simultaneous manner or in the reverse order according to the functions involved, which should It is understood by those skilled in the art to which the embodiments of the present disclosure belong.
  • Logic and / or steps represented in a flowchart or otherwise described herein, for example, a sequenced list of executable instructions that may be considered to implement a logical function, may be embodied in any computer-readable medium, For use by, or in combination with, an instruction execution system, device, or device (such as a computer-based system, a system that includes a processor, or another system that can fetch and execute instructions from an instruction execution system, device, or device) Or equipment.
  • a "computer-readable medium” may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • computer-readable media include the following: electrical connections (electronic devices) with one or more wirings, portable computer disk cartridges (magnetic devices), random access memory (RAM), Read-only memory (ROM), erasable and editable read-only memory (EPROM or flash memory), fiber optic devices, and portable optical disk read-only memory (CDROM).
  • the computer-readable medium may even be paper or other suitable medium on which the program can be printed, because, for example, by optically scanning the paper or other medium, followed by editing, interpretation, or other suitable Processing to obtain the program electronically and then store it in computer memory.
  • portions of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof.
  • multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system.
  • Discrete logic circuits with logic gates for implementing logic functions on data signals Logic circuits, ASICs with suitable combinational logic gate circuits, programmable gate arrays (PGA), field programmable gate arrays (FPGA), etc.
  • a person of ordinary skill in the art can understand that all or part of the steps carried by the methods in the foregoing embodiments can be implemented by a program instructing related hardware.
  • the program can be stored in a computer-readable storage medium.
  • the program is When executed, one or a combination of the steps of the method embodiment is included.
  • each functional unit in each embodiment of the present disclosure may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module.
  • the above integrated modules may be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
  • the aforementioned storage medium may be a read-only memory, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

La présente invention concerne un procédé de commande, un dispositif, un dispositif électronique et un support d'informations lisible par ordinateur, le procédé étant appliqué à un dispositif d'imagerie, le dispositif d'imagerie comprenant un réseau d'unités de pixels composé d'une pluralité de pixels d'exposition, chaque pixel d'exposition étant un pixel d'exposition courte, un pixel d'exposition moyenne ou un pixel d'exposition longue, et le procédé consistant : à déterminer un niveau de luminosité d'une luminosité de l'environnement ; si le niveau de luminosité de l'environnement de photographie est un niveau de luminosité élevé ou un niveau de luminosité faible, à ajuster les positions d'agencement des pixels d'exposition courte, des pixels d'exposition moyenne et/ou des pixels d'exposition longue dans le réseau d'unités de pixels de telle sorte qu'au moins deux pixels d'exposition moyenne sont agencés de manière adjacente l'un par rapport à l'autre, et à utiliser lesdits au moins deux pixels d'exposition moyenne agencés de manière adjacente en tant que première unité de pixels ; à effectuer une imagerie en fonction de valeurs de pixels de sortie des au moins deux pixels d'exposition moyenne dans la première unité de pixels. Le présent procédé peut conserver des informations plus efficaces dans une image capturée, améliorer la luminosité de l'image capturée, et améliorer ainsi l'effet d'imagerie et la qualité d'imagerie.
PCT/CN2019/088244 2018-08-13 2019-05-24 Procédé de commande, dispositif, équipement électronique et support d'informations lisible par ordinateur WO2020034702A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810915440.8 2018-08-13
CN201810915440.8A CN108965729A (zh) 2018-08-13 2018-08-13 控制方法、装置、电子设备和计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2020034702A1 true WO2020034702A1 (fr) 2020-02-20

Family

ID=64469406

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/088244 WO2020034702A1 (fr) 2018-08-13 2019-05-24 Procédé de commande, dispositif, équipement électronique et support d'informations lisible par ordinateur

Country Status (2)

Country Link
CN (1) CN108965729A (fr)
WO (1) WO2020034702A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113676635A (zh) * 2021-08-16 2021-11-19 Oppo广东移动通信有限公司 高动态范围图像的生成方法、装置、电子设备和存储介质
CN114697537A (zh) * 2020-12-31 2022-07-01 浙江清华柔性电子技术研究院 图像获取方法、图像传感器及计算机可读存储介质

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108965729A (zh) * 2018-08-13 2018-12-07 Oppo广东移动通信有限公司 控制方法、装置、电子设备和计算机可读存储介质
CN109788207B (zh) * 2019-01-30 2021-03-23 Oppo广东移动通信有限公司 图像合成方法、装置、电子设备及可读存储介质
CN110176039A (zh) * 2019-04-23 2019-08-27 苏宁易购集团股份有限公司 一种针对人脸识别的摄像机调校方法和系统
CN116847202B (zh) * 2023-09-01 2023-12-05 深圳市广和通无线通信软件有限公司 基于四色滤波阵列算法的曝光调整方法及设备

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1134565A1 (fr) * 2000-03-13 2001-09-19 CSEM Centre Suisse d'Electronique et de Microtechnique SA Pyromètre formateur d'images
CN105578065A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 高动态范围图像的生成方法、拍照装置和终端
CN105578075A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 高动态范围图像的生成方法、拍照装置和终端
CN108200354A (zh) * 2018-03-06 2018-06-22 广东欧珀移动通信有限公司 控制方法及装置、成像设备、计算机设备及可读存储介质
CN108322669A (zh) * 2018-03-06 2018-07-24 广东欧珀移动通信有限公司 图像的获取方法及装置、成像装置、计算机可读存储介质和计算机设备
CN108353140A (zh) * 2015-11-16 2018-07-31 微软技术许可有限责任公司 图像传感器系统
CN108353134A (zh) * 2015-10-30 2018-07-31 三星电子株式会社 使用多重曝光传感器的拍摄装置及其拍摄方法
CN108965729A (zh) * 2018-08-13 2018-12-07 Oppo广东移动通信有限公司 控制方法、装置、电子设备和计算机可读存储介质

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1134565A1 (fr) * 2000-03-13 2001-09-19 CSEM Centre Suisse d'Electronique et de Microtechnique SA Pyromètre formateur d'images
CN108353134A (zh) * 2015-10-30 2018-07-31 三星电子株式会社 使用多重曝光传感器的拍摄装置及其拍摄方法
CN108353140A (zh) * 2015-11-16 2018-07-31 微软技术许可有限责任公司 图像传感器系统
CN105578065A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 高动态范围图像的生成方法、拍照装置和终端
CN105578075A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 高动态范围图像的生成方法、拍照装置和终端
CN108200354A (zh) * 2018-03-06 2018-06-22 广东欧珀移动通信有限公司 控制方法及装置、成像设备、计算机设备及可读存储介质
CN108322669A (zh) * 2018-03-06 2018-07-24 广东欧珀移动通信有限公司 图像的获取方法及装置、成像装置、计算机可读存储介质和计算机设备
CN108965729A (zh) * 2018-08-13 2018-12-07 Oppo广东移动通信有限公司 控制方法、装置、电子设备和计算机可读存储介质

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114697537A (zh) * 2020-12-31 2022-07-01 浙江清华柔性电子技术研究院 图像获取方法、图像传感器及计算机可读存储介质
CN114697537B (zh) * 2020-12-31 2024-05-10 浙江清华柔性电子技术研究院 图像获取方法、图像传感器及计算机可读存储介质
CN113676635A (zh) * 2021-08-16 2021-11-19 Oppo广东移动通信有限公司 高动态范围图像的生成方法、装置、电子设备和存储介质
CN113676635B (zh) * 2021-08-16 2023-05-05 Oppo广东移动通信有限公司 高动态范围图像的生成方法、装置、电子设备和存储介质

Also Published As

Publication number Publication date
CN108965729A (zh) 2018-12-07

Similar Documents

Publication Publication Date Title
WO2020034737A1 (fr) Procédé de commande d'imagerie, appareil, dispositif électronique et support d'informations lisible par ordinateur
CN108322669B (zh) 图像获取方法及装置、成像装置和可读存储介质
CN109005364B (zh) 成像控制方法、装置、电子设备以及计算机可读存储介质
KR102376901B1 (ko) 이미징 제어 방법 및 이미징 디바이스
US10630906B2 (en) Imaging control method, electronic device and computer readable storage medium
EP3609177B1 (fr) Procédé de commande, appareil de commande, dispositif d'imagerie et dispositif électronique
CN109040609B (zh) 曝光控制方法、装置、电子设备和计算机可读存储介质
WO2020029732A1 (fr) Procédé et appareil de photographie panoramique, et dispositif d'imagerie
WO2020034702A1 (fr) Procédé de commande, dispositif, équipement électronique et support d'informations lisible par ordinateur
CN109788207B (zh) 图像合成方法、装置、电子设备及可读存储介质
WO2020057199A1 (fr) Procédé et dispositif d'imagerie, et dispositif électronique
CN108632537B (zh) 控制方法及装置、成像设备、计算机设备及可读存储介质
WO2020034701A1 (fr) Procédé et appareil de commande d'imagerie, dispositif électronique et support de stockage lisible
CN109040607B (zh) 成像控制方法、装置、电子设备和计算机可读存储介质
US11601600B2 (en) Control method and electronic device
CN109005369B (zh) 曝光控制方法、装置、电子设备以及计算机可读存储介质
WO2020029679A1 (fr) Procédé et appareil de commande, dispositif d'imagerie, dispositif électronique et support de stockage lisible
CN108900785A (zh) 曝光控制方法、装置和电子设备
CN108259754B (zh) 图像处理方法及装置、计算机可读存储介质和计算机设备
CN109005363B (zh) 成像控制方法、装置、电子设备以及存储介质
CN110276730B (zh) 图像处理方法、装置、电子设备
WO2017096855A1 (fr) Procédé et appareil permettant de régler de façon dynamique un paramètre gamma

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19849541

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19849541

Country of ref document: EP

Kind code of ref document: A1