CN111970439A - Image processing method and device, terminal and readable storage medium - Google Patents


Info

Publication number
CN111970439A
CN111970439A
Authority
CN
China
Prior art keywords
zoom
image
zooming
area
pixels
Prior art date
Legal status
Pending
Application number
CN202010796963.2A
Other languages
Chinese (zh)
Inventor
徐锐
Current Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Original Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Oppo Chongqing Intelligent Technology Co Ltd
Priority to CN202010796963.2A
Publication of CN111970439A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/62: Control of parameters via user interfaces
    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632: Graphical user interfaces [GUI] specially adapted for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N23/633: Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635: Region indicators; Field of view indicators
    • H04N23/67: Focus control based on electronic image sensor signals

Abstract

The application provides an image processing method. The image processing method includes acquiring a zoom position to determine a zoom region; selecting the pixels of the rows corresponding to the zoom region in a pixel array for exposure to obtain an original image; and digitally zooming the original image to output a zoomed image. According to the image processing method, the zoom region is determined from the zoom position, and only the pixels of the rows corresponding to the zoom region are exposed to obtain the original image. Compared with exposing the pixels of all rows of the entire pixel array to obtain the original image, the amount of data processing is smaller and the image output time is shortened. The application also provides an image processing apparatus, a terminal and a non-volatile computer-readable storage medium.

Description

Image processing method and device, terminal and readable storage medium
Technical Field
The present application relates to the field of image technologies, and in particular, to an image processing method, an image processing apparatus, a terminal, and a non-volatile computer-readable storage medium.
Background
In digital zooming, a portion of an original image containing the details of interest is cropped out, and the cropped small image is then enlarged to obtain a magnified detail image. Since the definition of the digitally zoomed image is essentially limited by that of the original image, an original image with a higher resolution is generally required in order to obtain a zoomed image with higher definition. However, outputting an original image with a higher resolution involves a larger amount of data processing and a longer image output time.
Disclosure of Invention
Embodiments of the present application provide an image processing method, an image processing apparatus, a terminal, and a non-volatile computer-readable storage medium.
The image processing method comprises the steps of acquiring a zoom position to determine a zoom region; selecting the pixels of the rows corresponding to the zoom region in a pixel array for exposure to obtain an original image; and digitally zooming the original image to output a zoomed image.
The image processing apparatus of the embodiments of the application comprises an acquisition module, an exposure module and a zoom module. The acquisition module is used to acquire a zoom position to determine a zoom region; the exposure module is used to select the pixels of the rows corresponding to the zoom region in the pixel array for exposure to obtain an original image; the zoom module is used to digitally zoom the original image to output a zoomed image.
The terminal of the embodiments of the application comprises an image sensor and a processor, wherein the image sensor comprises a pixel array and a row selector. The processor is used to acquire a zoom position to determine a zoom region; the row selector is used to select the pixels of the rows corresponding to the zoom region in the pixel array for exposure to obtain an original image; the processor is further configured to digitally zoom the original image to output a zoomed image.
The non-volatile computer-readable storage medium of the embodiments of the application embodies a computer program that, when executed by one or more processors, causes the processors to perform an image processing method. The image processing method includes acquiring a zoom position to determine a zoom region; selecting the pixels of the rows corresponding to the zoom region in a pixel array for exposure to obtain an original image; and digitally zooming the original image to output a zoomed image.
According to the image processing method, the image processing apparatus, the terminal and the non-volatile computer-readable storage medium, the zoom region is determined from the zoom position, and only the pixels of the rows corresponding to the zoom region are exposed to obtain the original image. Compared with exposing the pixels of all rows of the entire pixel array to obtain the original image, the amount of data processing is smaller and the image output time is reduced.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. The drawings in the following description illustrate only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 2 is a block schematic diagram of an image processing apparatus according to some embodiments of the present application;
FIG. 3 is a schematic diagram of a terminal structure according to some embodiments of the present application;
FIG. 4 is a schematic diagram of the structure of an image sensor;
FIG. 5a is a schematic diagram of an arrangement of a pixel array according to some embodiments of the present application;
FIG. 5b is a schematic diagram of an arrangement of minimum repeating units of a pixel array according to some embodiments of the present application;
FIG. 6 is a schematic view of a scene of an image processing method according to some embodiments of the present application;
FIG. 7 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 8 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 9 is a schematic view of a scene of an image processing method according to some embodiments of the present application;
FIG. 10 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 11 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 12 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application; and
FIG. 13 is a schematic diagram of a connection between a processor and a computer-readable storage medium according to some embodiments of the present application.
Detailed Description
Embodiments of the present application will be further described below with reference to the accompanying drawings. The same or similar reference numbers in the drawings identify the same or similar elements or elements having the same or similar functionality throughout. In addition, the embodiments of the present application described below in conjunction with the accompanying drawings are exemplary and are only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the present application.
Referring to fig. 1, an image processing method according to an embodiment of the present application includes the following steps:
011: acquiring a zoom position to determine a zoom area;
012: selecting pixels 3211 in a row of the pixel array 321 corresponding to the zoom area for exposure to obtain an original image; and
013: the original image is digitally zoomed to output a zoomed image.
Referring to fig. 2, in some embodiments, the image processing apparatus 10 includes an acquisition module 11, an exposure module 12, and a zoom module 13. The acquisition module 11, the exposure module 12 and the zoom module 13 are configured to perform step 011, step 012 and step 013, respectively. That is, the obtaining module 11 is configured to obtain a zoom position to determine a zoom region; the exposure module 12 is configured to select a pixel 3211 in a row of the pixel array 321 corresponding to the zoom area for exposure, so as to obtain an original image; the zoom module 13 is used to digitally zoom the original image to output a zoomed image.
Referring to fig. 3 and 4, in some embodiments, the terminal 100 further includes an image sensor 32 and a processor 20. Image sensor 32 includes a pixel array 321 and a row selector 322, processor 20 is configured to obtain a zoom position to determine a zoom region; the row selector 322 is configured to select a pixel 3211 in a row of the pixel array 321 corresponding to the zoom area for exposure, so as to obtain an original image; the processor 20 is also operative to digitally zoom the original image to output a zoomed image. That is, step 011 and step 013 can be implemented by processor 20, and step 012 can be implemented by row selector 322.
Specifically, the terminal 100 includes a housing 40, the processor 20, and a camera 30. The terminal 100 may be a mobile phone, a tablet computer, a display, a notebook computer, a teller machine, a gate, a smart watch, a head-up display device, a game console, etc. As shown in fig. 3, the embodiments of the present application are described by taking a mobile phone as an example of the terminal 100; it is understood that the specific form of the terminal 100 is not limited to a mobile phone. The housing 40 may also be used to mount functional modules of the terminal 100, such as an imaging device (i.e., the camera 30), a power supply device, and a communication device, so that the housing 40 protects these modules against dust, falling, water, and the like.
The camera 30 may be a front camera, a rear camera, a side camera, an off-screen camera, etc., without limitation. The camera 30 includes a lens module 31 and an image sensor 32, and the camera 30 and the processor 20 are mounted in a housing 40. The housing 40 is provided with a light hole so that the lens module 31 of the camera 30 is exposed from the light hole, light enters the lens module 31 through the light hole and then enters the image sensor 32, and the image sensor 32 is configured to convert a light signal irradiated onto the image sensor 32 into an electrical signal to generate an original image.
Image sensor 32 includes a pixel array 321 and a row selector 322. The pixel array 321 may be a matrix, a circular array, or the like, and may be determined according to a desired shape of a captured image, and in the embodiment, the pixel array 321 is a matrix. Upon exposure, row selector 322 selects rows of pixel array 321 to achieve either row-by-row exposure or simultaneous exposure.
Referring to fig. 5a and 5b, the pixel array 321 may be formed by arranging a plurality of minimum repeating units (as shown in fig. 5 b), and each minimum repeating unit may be arranged in a bayer array. The minimal repeating unit may also include panchromatic pixels and color pixels, wherein the color pixels have a narrower spectral response than the panchromatic pixels. For example, the response spectrum of a color pixel is a portion of the response spectrum of a panchromatic pixel, and for example, the response spectrum of a color pixel is the spectrum of a certain color of visible light, and the response spectrum of a panchromatic pixel is the spectrum of the entire visible light.
Each minimal repeating unit may have a plurality of color pixels of different colors, and the color pixels in each minimal repeating unit include a color a, a color b, and/or a color c, for example, the color a is red, the color b is green, and the color c is blue, or for example, the color a is magenta, the color b is cyan, and the color c is yellow, etc., without limitation. The spectra of the plurality of differently colored color pixels may not intersect or partially intersect. In other embodiments, the colored pixels in the minimal repeating unit comprise a color a, a color b, or a color c; alternatively, the color pixels in the minimum repeating unit include a color a and a color b; alternatively, the color pixels in the minimal repeating unit include color b and color c; alternatively, the color pixels in the minimum repeating unit include a color a and a color c; for example, color a may be red, color b may be green, and color c may be blue.
The color w in fig. 5a and 5b may refer to the color of a panchromatic pixel, e.g. white. Because a panchromatic pixel has a wider spectral response than a color pixel, it can receive more of the optical signal and can acquire more light in a dark environment, which helps improve shooting in low light.
Referring to fig. 6, a camera application is generally provided on the terminal 100 (e.g., a mobile phone), or the terminal 100 is provided with a camera start key. A start request for the camera 30 can be issued by tapping the camera application or pressing the camera start key; after receiving the start request, the processor 20 controls the camera 30 to start and displays a preview image on the display screen 50.
On the preview image interface, the processor 20 may determine a zoom position according to a received zoom operation of the user, where the zoom position is the touch position of the touch operation. The zoom position is generally the center of a target area whose details the user wants to view; the target area is the area that needs zooming, that is, the zoom area. For example, as shown in fig. 6, when the user wants to view the appearance of the target person in the preview image interface, the user may first touch the position of the head of the target person. The touched position is the zoom position, and the processor 20 frames the head of the target person with a predetermined selection frame S, where the area corresponding to the predetermined selection frame S is the zoom area. Of course, in other embodiments, the area selected by the user may be intelligently recognized to identify the target type in the area, such as a human face, an animal, or a plant, and the zoom area may then be adjusted according to the identified target type, so as to ensure that the target is framed within the zoom area.
After the zoom position is acquired to determine the zoom region, since the pixels in the preview image correspond one to one to the pixels 3211 of the pixel array 321, a coordinate system is established with the upper left corner of the preview image and the upper left corner of the pixel array 321 as the origins of coordinates, so that pixels with the same coordinates correspond to each other. The processor 20 may determine the pixel array region corresponding to the zoom region according to the position coordinates of the pixels located in the zoom region, thereby determining the rows in which the pixel array region is located. For example, if the coordinates of the pixels in the zoom region in the coordinate system of the preview image include (4,4), (4,5), (4,6), (4,7), (5,4), (5,5), (5,6), (5,7), (6,4), (6,5), (6,6), (6,7), (7,4), (7,5), (7,6) and (7,7), then the coordinates of the corresponding pixels 3211 in the coordinate system of the pixel array 321 are the same, and from the row coordinates of the pixels 3211 in the pixel array region it can be determined that the pixel array region is located in rows 4, 5, 6 and 7.
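The row determination in this example can be sketched as follows; a hypothetical Python illustration with made-up variable names, using (row, column) coordinates with the origin at the upper left corner, as in the text:

```python
# The preview-image pixels map one to one onto sensor pixels, so the
# rows to expose (and, later, the columns to read out) fall directly
# out of the zoom-region coordinates.
zoom_pixels = [(r, c) for r in range(4, 8) for c in range(4, 8)]  # (4,4)..(7,7)

# Rows in which the corresponding pixel-array region lies:
rows_to_expose = sorted({r for r, _ in zoom_pixels})
# Columns, used later by the readout circuit:
cols_to_read = sorted({c for _, c in zoom_pixels})
```

For the 16 coordinates of the example, `rows_to_expose` comes out as rows 4 through 7, matching the text.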
After the rows corresponding to the zoom area are determined (rows 4, 5, 6 and 7 in the above example), the row selector 322 controls row-by-row exposure or simultaneous exposure of rows 4, 5, 6 and 7 of the pixel array 321 to obtain an original image, which may be a raw-format image containing the pixel values of all the exposed pixels 3211 (the pixels of rows 4, 5, 6 and 7).
The processor 20 digitally zooms the original image to output the final zoomed image. The image output by an image sensor generally has a predetermined resolution (e.g. 400 × 300, 1600 × 900, etc.), and since the image sensor 32 of the present application outputs only an original image covering the zoom region, the original image needs to be digitally zoomed to raise its resolution to the predetermined resolution, so that a zoomed image reaching the predetermined resolution is output.
According to the image processing method, the image processing apparatus 10 and the terminal 100 of the embodiments of the application, the zoom area is determined from the zoom position, and only the pixels 3211 of the rows corresponding to the zoom area are exposed to obtain the original image. Compared with exposing the pixels 3211 of all rows of the entire pixel array 321, the amount of data processing is smaller and the image output time is reduced. With the total data throughput per unit time unchanged, reducing the amount of data required for each frame increases the number of image frames that can be processed per unit time, which raises the output frame rate of the image sensor 32 and supports functions such as multi-frame noise reduction and slow motion. In addition, the reduced data processing also helps reduce power consumption.
Referring to FIG. 7, in some embodiments, step 011 includes the steps of:
0111: acquiring a zooming position according to touch operation received by the preview image interface;
0112: and taking the zooming position as a center, and selecting a region with a preset size as a zooming region.
Referring again to fig. 2, in some embodiments, the obtaining module 11 is further configured to perform step 0111 and step 0112. Namely, the obtaining module 11 is further configured to obtain a zoom position according to a touch operation received by the preview image interface; and selecting a region with a predetermined size as a zoom region with the zoom position as a center.
Referring again to fig. 3 and 4, in some embodiments, the processor 20 is further configured to obtain a zoom position according to a touch operation received by the preview image interface; and selecting a region with a predetermined size as a zoom region with the zoom position as a center. That is, step 0111 and step 0112 may be implemented by processor 20.
Specifically, after the preview image interface of the display screen 50 receives a touch operation of the user, the processor 20 may determine the zoom position. The touch operation may be a tap, a two-finger spread, or the like. For a tap, the tapped position is the zoom position; for a two-finger spread, the midpoint of the line connecting the positions touched by the two fingers is the zoom position, so that the zoom position can be accurately determined.
Then, a region of a predetermined size is selected as the zoom region with the zoom position as its center. For example, as shown in fig. 6, the zoom position is P, and a rectangular region of a predetermined size (the selection frame S in fig. 6) is selected as the zoom region with P at its center, where P is the intersection of the diagonals of the rectangular region. It is understood that the predetermined size may be 5 × 5 pixels, 10 × 10 pixels, 50 × 50 pixels, 160 × 90 pixels, 400 × 300 pixels, etc., and may be determined according to the resolution of the display screen 50 of the terminal 100, as long as the zoom region covers the area whose details the user wants to magnify and view. In addition, the aspect ratio of the rectangular region of the predetermined size may be determined according to the display ratio of the display screen 50, so that the display ratio of the image output by the image sensor 32 matches the display screen 50.
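Steps 0111 and 0112 can be sketched as follows; a hypothetical Python illustration (function names and coordinate conventions are made up, coordinates are (x, y) in preview-image pixels):

```python
def zoom_position(touches):
    """Zoom position from a touch operation: a single tap gives the
    position directly; for a two-finger spread the midpoint of the line
    connecting the two contacts is used."""
    if len(touches) == 1:
        return touches[0]
    (x1, y1), (x2, y2) = touches
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def fixed_size_zoom_region(pos, width, height):
    """A width x height rectangle whose diagonals intersect at pos
    (returned as left, top, width, height)."""
    x, y = pos
    return (x - width / 2, y - height / 2, width, height)

p = zoom_position([(100, 80), (140, 120)])   # two-finger gesture
region = fixed_size_zoom_region(p, 160, 90)  # e.g. a 160 x 90 predetermined size
```

The 160 × 90 size here is just one of the example sizes from the text; in practice it would follow from the display resolution and aspect ratio as described above.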
Referring to fig. 8, in some embodiments, step 011 further includes the steps of:
0113: acquiring a zooming position and zooming magnification according to touch operation received by a preview image interface;
0114: determining the size of a zooming area according to the zooming magnification; and
0115: the zoom position is taken as the center position of the zoom area to determine the zoom area.
Referring again to fig. 2, in some embodiments, the obtaining module 11 is further configured to perform step 0113, step 0114, and step 0115. Namely, the obtaining module 11 is further configured to obtain a zoom position and a zoom magnification according to the touch operation received by the preview image interface; determining the size of a zooming area according to the zooming magnification; and using the zoom position as the center position of the zoom area to determine the zoom area.
Referring again to fig. 3 and 4, in some embodiments, the processor 20 is further configured to obtain a zoom position and a zoom magnification according to a touch operation received by the preview image interface; determining the size of a zooming area according to the zooming magnification; and using the zoom position as the center position of the zoom area to determine the zoom area. That is, step 0113, step 0114, and step 0115 may be implemented by processor 20.
Specifically, in the preview image interface, if the user wants to further view details of a certain target area, the user may enlarge the preview image, for example by manually selecting a zoom magnification. For example, as shown in figs. 6 and 9, the preview image interface has a progress bar for adjusting the zoom magnification (which can vary from 0 to 10 times); the user drags the progress bar to increase the zoom magnification (fig. 6 shows the preview image interface at a magnification of 0 times, and fig. 9 at 2 times), thereby enlarging the preview image. As another example, the user may change the zoom magnification through a gesture operation, e.g. spreading two fingers on the display screen 50 to enlarge the area corresponding to the touch position (generally the midpoint of the line connecting the touch positions of the two fingers) in the preview image interface.
The processor 20 can determine not only the zoom position but also the zoom magnification according to the touch operation received by the preview image interface. When the zoom magnification is adjusted through the progress bar, it can be determined from the position of the progress bar; when it is adjusted through a gesture operation, it can be determined from the speed and distance of the two fingers spreading (or pinching), for example with the spreading speed and distance positively correlated with the zoom magnification.
After the zoom magnification is determined, the size of the zoom area may be determined according to it. It is understood that the larger the zoom magnification, the smaller the proportion of the preview image occupied by the target area whose details the user wants to view, and hence the smaller the zoom area. For example, at a zoom magnification of 2 the zoom area may be 1/2, 1/4, etc. of the preview image, and at a zoom magnification of 4 it may be 1/4, 1/8, etc. Of course, the zoom area cannot be too large or too small: the larger the zoom area, the more rows need to be exposed and the larger the amount of data processing; the smaller the zoom area, the more likely it is that the image of the area whose details the user wants to view is not completely contained in it. Therefore, in the embodiment of the present application, the size of the zoom area equals 1/n of the size of the preview image, where n is the zoom magnification. A zoom area of this size is appropriate: it both reduces the amount of data processing and ensures that the image of the area whose details the user wants to view is completely contained.
After determining the center position (i.e., zoom position) of the zoom region and the size of the zoom region, the zoom region can be accurately determined in the preview image, and as shown in fig. 9, when the zoom magnification is 2 times, the zoom region (e.g., the selection box S in fig. 9) is 1/2 of the size of the preview image.
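Steps 0113 to 0115 can be sketched as follows; a hypothetical Python illustration that reads the embodiment's "1/n of the preview image" as a linear scale factor on each dimension (the text leaves the exact interpretation open), with made-up names:

```python
def zoom_region_from_magnification(zoom_pos, preview_w, preview_h, n):
    """Zoom region sized to 1/n of the preview in each dimension,
    centred on zoom_pos; returned as (left, top, width, height)."""
    w, h = preview_w / n, preview_h / n
    x, y = zoom_pos
    return (x - w / 2, y - h / 2, w, h)

# Magnification 2 on an 800 x 600 preview, zoom position at the centre:
region = zoom_region_from_magnification((400, 300), 800, 600, 2)
```

At a magnification of 2 this reproduces the fig. 9 case: the zoom region is 1/2 the size of the preview image.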
Referring to fig. 10, in some embodiments, step 011 further includes the steps of:
0116: acquiring a target zoom area of which the zoom area is positioned in a preview image interface;
step 012 includes the steps of:
0121: the pixels 3211 of the row in the pixel array 321 corresponding to the target zoom region are selected for exposure to acquire an original image.
Referring again to fig. 2, in some embodiments, the obtaining module 11 is further configured to perform step 0116, and the exposing module 12 is further configured to perform step 0121. Namely, the obtaining module 11 is further configured to obtain a target zoom area of the zoom area located in the preview image interface; the exposure module 12 is further configured to select a pixel 3211 in a row of the pixel array 321 corresponding to the target zoom region for exposure to obtain an original image.
Referring again to fig. 3 and 4, in some embodiments, the processor 20 is further configured to obtain a target zoom area where the zoom area is located in the preview image interface; the pixels 3211 of the row in the pixel array 321 corresponding to the target zoom region are selected for exposure to acquire an original image. That is, step 0116 may be implemented by processor 20 and step 0121 may be implemented by row selector 322.
Specifically, in order to prevent the zoom region from being located outside the preview image interface, the zoom region is generally adjusted toward the center of the preview image interface. For example, the user may actively adjust the shooting angle so that the zoom region lies at the center of the preview image interface, or the processor 20 may prompt the user to adjust the shooting angle to that end. However, in a special case where the user directly zooms in on a peripheral region of the preview image interface without adjusting the shooting angle, part of the determined zoom region may lie outside the preview image interface. Since the part of the zoom region outside the preview image interface has no corresponding pixels 3211 in the pixel array 321, the processor 20 may determine that part to be an invalid region, and take the part of the zoom region located within the preview image interface as the target zoom region, so that the pixels 3211 of the rows of the pixel array 321 corresponding to the target zoom region are selected for exposure to acquire the original image.
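The clipping described above amounts to intersecting the zoom region with the preview bounds; a hypothetical sketch with made-up names:

```python
def target_zoom_region(region, preview_w, preview_h):
    """Intersect the zoom region with the preview interface; any part
    lying outside the preview has no corresponding sensor pixels and is
    treated as invalid."""
    left, top, w, h = region
    l = max(0.0, left)
    t = max(0.0, top)
    r = min(float(preview_w), left + w)
    b = min(float(preview_h), top + h)
    return (l, t, max(0.0, r - l), max(0.0, b - t))

# A zoom region hanging off the top-left corner of an 800 x 600 preview:
clipped = target_zoom_region((-50, -20, 200, 100), 800, 600)
```

The portion outside the preview is simply discarded, so only rows backed by real sensor pixels are ever selected for exposure.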
Referring to fig. 11, in some embodiments, step 012 further includes:
0122: in the exposed pixels 3211, the pixel values of the pixels in the column corresponding to the zoom area are read to obtain an original image.
Referring again to fig. 2, in some embodiments, the exposure module 12 is further configured to perform step 0122. That is, the exposure module 12 is further configured to read pixel values of pixels in a column corresponding to the zoom area in the exposed pixels 3211 to obtain an original image.
Referring to fig. 3 and 4, in some embodiments, the image sensor 32 further includes a readout circuit 323, where the readout circuit 323 is configured to read the pixel values of the pixels in the columns corresponding to the zoom area among the exposed pixels 3211 to obtain the original image. That is, step 0122 may be implemented by the readout circuit 323.
Specifically, after the row selector 322 exposes the rows corresponding to the zoom area line by line or simultaneously, the obtained pixel data includes the pixel values of all the pixels 3211 in those rows. The processor 20 determines the columns in which the zoom area is located according to the horizontal coordinates of the position coordinates of the pixels 3211 in the pixel array region corresponding to the zoom area. The readout circuit 323 can read out the pixel values of the pixels 3211 in the pixel array 321 column by column, and it can read out only the pixel values of the pixels 3211 in the columns corresponding to the zoom area, so it is not necessary to read out all the exposed pixels 3211 and the readout speed is fast. When the readout circuit 323 reads out the pixel values of the pixels 3211 in the columns corresponding to the zoom region, the exposed pixels 3211 yield pixel values while the unexposed pixels 3211 do not; the pixel values of the pixels 3211 corresponding to the zoom region can therefore be read out accurately, and the original image is acquired from those pixel values.
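As a rough model of this row-then-column selection, the following Python sketch uses a plain 2D list to stand in for the pixel array 321; the function name and ranges are hypothetical and only illustrate that unexposed rows are never touched and only the zoom-area columns are read out:

```python
def read_zoom_pixels(pixel_array, row_range, col_range):
    """Simulate the row-selector / readout-circuit split: only the rows in
    row_range are 'exposed', and of those, only the columns in col_range
    are read out, yielding the original image of the zoom area.

    pixel_array: 2D list of pixel values (the full sensor array).
    row_range: (first_row, last_row) inclusive, chosen by the row selector.
    col_range: (first_col, last_col) inclusive, chosen by the readout circuit.
    """
    r0, r1 = row_range
    c0, c1 = col_range
    # Exposure happens per row; the readout then keeps only the zoom-area
    # columns within each exposed row.
    return [row[c0:c1 + 1] for row in pixel_array[r0:r1 + 1]]
```

On a 5 × 5 array, selecting rows 1-2 and columns 2-3 returns a 2 × 2 original image without ever reading the remaining pixels.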
Referring to fig. 12, in some embodiments, step 013 includes the following steps:
0131: the original image is subjected to interpolation processing according to a predetermined resolution to output a zoom image.
Referring again to fig. 2, in some embodiments, the zoom module 13 is further configured to perform step 0131. That is, the zoom module 13 is also configured to interpolate the original image according to a predetermined resolution to output a zoom image.
Referring again to fig. 3 and 4, in some embodiments, the processor 20 is further configured to perform interpolation processing on the original image according to a predetermined resolution to output a zoom image.
Specifically, the image output by an image sensor generally has a predetermined resolution (e.g., 400 × 300, 1600 × 900, etc.). Since the image sensor 32 of the present application outputs only the pixel information of the rows and columns corresponding to the zoom region, whose resolution does not reach the predetermined resolution, the original image needs to be digitally zoomed to raise its resolution to the predetermined resolution, so that a zoom image at the predetermined resolution can be output.
When the original image is digitally zoomed to increase its resolution, a number of interpolation pixels need to be inserted, and their pixel values must be determined by interpolation so that the transition between an interpolation pixel and the surrounding original-image pixels is not too abrupt, which improves the imaging quality of the zoomed image. For example, if the original image is 20 × 20 pixels and the predetermined resolution is 40 × 40, then 1200 of the 1600 output pixels are interpolation pixels whose values need to be determined by interpolation, for example by calculating each interpolation pixel's value from the pixel values of several surrounding original-image pixels, so as to obtain the zoom image.
Referring to fig. 13, an embodiment of the present disclosure provides a non-volatile computer-readable storage medium 300 storing a computer program 302. When the computer program 302 is executed by one or more processors 200, the processors 200 execute the image processing method of any of the above embodiments.
For example, referring to fig. 1, the computer program 302, when executed by the one or more processors 200, causes the processors 200 to perform the steps of:
011: acquiring a zoom position to determine a zoom area;
012: selecting pixels 3211 in a row of the pixel array 321 corresponding to the zoom area for exposure to obtain an original image; and
013: the original image is digitally zoomed to output a zoomed image.
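Steps 011-013 can be composed into a toy end-to-end pipeline. The Python sketch below is illustrative only: it crops a 2D list as a stand-in for the row/column selection of steps 011-012, and uses nearest-neighbour resampling as the simplest stand-in for the digital zoom of step 013 (names and signature are hypothetical):

```python
def zoom_pipeline(pixel_array, region, out_w, out_h):
    """Toy version of steps 011-013: the zoom region fixes which rows are
    exposed and which columns are read out (the original image), and the
    original image is then digitally zoomed to the output size."""
    # Steps 011/012: expose the rows and read the columns of the zoom region.
    left, top, w, h = region
    raw = [row[left:left + w] for row in pixel_array[top:top + h]]
    # Step 013: digital zoom by nearest-neighbour resampling.
    return [[raw[oy * h // out_h][ox * w // out_w] for ox in range(out_w)]
            for oy in range(out_h)]
```

Zooming a 2 × 2 region of a 4 × 4 array up to 4 × 4 simply duplicates each original pixel into a 2 × 2 block; a real implementation would interpolate instead, as described for step 0131.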
For another example, referring to fig. 7, when the computer program 302 is executed by the one or more processors 200, the processors 200 may further perform the steps of:
0111: acquiring a zoom position according to a touch operation received by the preview image interface;
0112: taking the zoom position as the center, selecting a region of a preset size as the zoom region.
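Steps 0111-0112 amount to centring a rectangle of a preset size on the touch point. A minimal Python sketch (function name hypothetical; coordinates assumed to be in preview-interface pixels):

```python
def zoom_region_from_touch(touch_pos, preset_size):
    """Steps 0111/0112: take the touch position on the preview interface as
    the zoom position and centre a region of a preset size on it.

    touch_pos: (x, y) of the touch; preset_size: (width, height).
    Returns (left, top, width, height). The region may extend past the
    preview edges; the target-zoom-area step handles that case later.
    """
    x, y = touch_pos
    w, h = preset_size
    return (x - w // 2, y - h // 2, w, h)
```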
In the description herein, reference to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method description in a flow chart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more program modules for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application includes additional implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order depending on the functionality involved, as would be understood by those reasonably skilled in the art to which the present application pertains.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and are not to be construed as limiting the present application; changes, modifications, substitutions, and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (14)

1. An image processing method, comprising:
acquiring a zoom position to determine a zoom area;
selecting pixels of a row corresponding to the zooming area in a pixel array for exposure so as to obtain an original image; and
digitally zooming the original image to output a zoomed image.
2. The image processing method according to claim 1, wherein the obtaining of the zoom position to determine the zoom region comprises:
acquiring the zoom position according to a touch operation received by a preview image interface; and
taking the zoom position as the center and selecting a region of a preset size as the zoom region.
3. The image processing method according to claim 1, wherein the obtaining of the zoom position to determine the zoom region comprises:
acquiring the zoom position and a zoom magnification according to a touch operation received by a preview image interface;
determining the size of the zoom area according to the zoom magnification; and
taking the zoom position as the center position of the zoom area to determine the zoom area.
4. The image processing method according to claim 2 or 3, wherein the acquiring a zoom position to determine a zoom region further comprises:
acquiring a target zoom area of the zoom area in the preview image interface;
selecting pixels of a row in the pixel array corresponding to the zoom area for exposure to obtain an original image, including:
selecting pixels of a row in the pixel array corresponding to the target zoom area for exposure so as to obtain the original image.
5. The image processing method according to any one of claims 1 to 3, wherein said exposing pixels of a row of the pixel array corresponding to the zoom region to obtain the original image comprises:
reading pixel values of pixels of a column corresponding to the zoom area in the exposed pixels to acquire the original image.
6. The method of claim 1, wherein said digitally zooming the original image to output a zoomed image comprises:
performing interpolation processing on the original image according to a predetermined resolution to output the zoom image.
7. An image processing apparatus characterized by comprising:
an acquisition module for acquiring a zoom position to determine a zoom region;
an exposure module for selecting pixels of a row corresponding to the zoom area in the pixel array for exposure so as to obtain an original image; and
a zoom module for digitally zooming the original image to output a zoomed image.
8. A terminal comprising an image sensor including a pixel array and a row selector, and a processor for obtaining a zoom position to determine a zoom region; the row selector is used for selecting pixels of a row corresponding to the zooming area in the pixel array to be exposed so as to obtain an original image; the processor is further configured to digitally zoom the raw image to output a zoomed image.
9. The terminal of claim 8, wherein the processor is further configured to:
acquiring the zoom position according to a touch operation received by a preview image interface; and
taking the zoom position as the center and selecting a region of a preset size as the zoom region.
10. The terminal of claim 8, wherein the processor is further configured to:
acquiring the zoom position and a zoom magnification according to a touch operation received by a preview image interface;
determining the size of the zoom area according to the zoom magnification; and
taking the zoom position as the center position of the zoom area to determine the zoom area.
11. The terminal of claim 9 or 10, wherein the processor is further configured to obtain a target zoom region where the zoom region is located in the preview image interface; the row selector is further configured to select pixels of a row in the pixel array corresponding to the target zoom area for exposure, so as to obtain the original image.
12. A terminal according to any of claims 8-10, wherein the image sensor further comprises a readout circuit for reading pixel values of pixels of a column of the exposed pixels corresponding to the zoom area to obtain the original image.
13. The terminal of claim 10, wherein the processor is further configured to interpolate an original image according to a predetermined resolution to output the zoom image.
14. A non-transitory computer-readable storage medium storing a computer program which, when executed by one or more processors, implements the image processing method of any one of claims 1 to 6.
CN202010796963.2A 2020-08-10 2020-08-10 Image processing method and device, terminal and readable storage medium Pending CN111970439A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010796963.2A CN111970439A (en) 2020-08-10 2020-08-10 Image processing method and device, terminal and readable storage medium

Publications (1)

Publication Number Publication Date
CN111970439A true CN111970439A (en) 2020-11-20

Family

ID=73364602

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010796963.2A Pending CN111970439A (en) 2020-08-10 2020-08-10 Image processing method and device, terminal and readable storage medium

Country Status (1)

Country Link
CN (1) CN111970439A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004064334A (en) * 2002-07-26 2004-02-26 Mitsubishi Electric Corp Image pick-up apparatus
US20070098386A1 (en) * 2004-10-29 2007-05-03 Sony Corporation Imaging method and imaging apparatus
CN104301605A (en) * 2014-09-03 2015-01-21 Beijing Zhigu Technology Services Co., Ltd. Imaging control method and device for digital zoom images and imaging equipment
CN105100622A (en) * 2015-08-04 2015-11-25 Guangzhou Feimi Electronic Technology Co., Ltd. Method, device and electronic equipment for realizing zooming
CN109286750A (en) * 2018-09-21 2019-01-29 Chongqing Transsion Technology Co., Ltd. A kind of Zooming method and a kind of intelligent terminal based on intelligent terminal
US20190222746A1 (en) * 2016-09-26 2019-07-18 SZ DJI Technology Co., Ltd. Focusing method and apparatus, image photographing method and apparatus, and photographing system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023185127A1 (en) * 2022-03-29 2023-10-05 Honor Device Co., Ltd. Image processing method and electronic device
CN116939363A (en) * 2022-03-29 2023-10-24 Honor Device Co., Ltd. Image processing method and electronic equipment
CN116939363B (en) * 2022-03-29 2024-04-26 Honor Device Co., Ltd. Image processing method and electronic equipment

Similar Documents

Publication Publication Date Title
US10764522B2 (en) Image sensor, output method, phase focusing method, imaging device, and terminal
US10916036B2 (en) Method and system of generating multi-exposure camera statistics for image processing
CN110505411B (en) Image shooting method and device, storage medium and electronic equipment
CN112529775A (en) Image processing method and device
US10270988B2 (en) Method for generating high-dynamic range image, camera device, terminal and imaging method
US8203633B2 (en) Four-channel color filter array pattern
US8237831B2 (en) Four-channel color filter array interpolation
CN110602467B (en) Image noise reduction method and device, storage medium and electronic equipment
CN109005364A (en) Image formation control method, device, electronic equipment and computer readable storage medium
CN107911682B (en) Image white balance processing method, device, storage medium and electronic equipment
CN108419022A (en) Control method, control device, computer readable storage medium and computer equipment
US10999483B2 (en) Display-based camera apparatus and methods
CN112788320B (en) Image sensor, image acquisition device, electronic equipment and control method thereof
EP3836532A1 (en) Control method and apparatus, electronic device, and computer readable storage medium
WO2023124607A1 (en) Image generation method and apparatus, electronic device, and computer-readable storage medium
WO2023082766A1 (en) Image sensor, camera module, electronic device, and image generation method and apparatus
CN113676708A (en) Image generation method and device, electronic equipment and computer-readable storage medium
CN113674685B (en) Pixel array control method and device, electronic equipment and readable storage medium
CN111970439A (en) Image processing method and device, terminal and readable storage medium
CN109479087B (en) Image processing method and device
JP7050432B2 (en) Imaging device
EP4117282A1 (en) Image sensor, imaging apparatus, electronic device, image processing system and signal processing method
JP2023169254A (en) Imaging element, operating method for the same, program, and imaging system
CN115699785A (en) Screen shooting control method, terminal device and storage medium
CN111131714A (en) Image acquisition control method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201120