WO2022213784A1 - Image processing method and apparatus, electronic device, and storage medium - Google Patents
Image processing method and apparatus, electronic device, and storage medium
- Publication number: WO2022213784A1 (application PCT/CN2022/081286)
- Authority: WIPO (PCT)
Classifications (all under G06T, image data processing or generation, in general)
- G06T5/90: Dynamic range modification of images or parts thereof (image enhancement or restoration)
- G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T7/13: Edge detection (image analysis; segmentation)
- G06T7/90: Determination of colour characteristics (image analysis)
- G06T2207/10024: Color image (indexing scheme, image acquisition modality)
- G06T2207/20221: Image fusion; image merging (indexing scheme, image combination)
Definitions
- Embodiments of the present disclosure relate to an image processing method, an image processing apparatus, an electronic device, and a non-transitory computer-readable storage medium.
- Through color restoration processing, an acquired digital image can be restored to an image with real color. The digital image may be a color image, in which case the image after color restoration processing has real color; the digital image may also be a black-and-white image, in which case the image after color restoration processing is the real-color image corresponding to the black-and-white image.
- At least one embodiment of the present disclosure provides an image processing method, including: acquiring an initial image; performing slice processing on the initial image to obtain multiple slice images corresponding to the initial image; performing tone processing on the multiple slice images to obtain multiple processed slice images, wherein the tones of the multiple processed slice images remain consistent; and, according to the positional relationship of the multiple slice images in the initial image, performing stitching processing on the multiple processed slice images to obtain a composite image.
- Performing tone processing on the plurality of slice images to obtain a plurality of processed slice images includes: determining a processing order; determining, based on the processing order, a reference slice image among the plurality of slice images, wherein the reference slice image is the first slice image to undergo tone processing as determined by the processing order; performing tone processing on the reference slice image to obtain a processed reference slice image corresponding to the reference slice image; taking the tone of the processed reference slice image as the reference tone; and, based on the reference tone, performing tone processing on all slice images other than the reference slice image to obtain processed slice images corresponding to those slice images, wherein the tones of those processed slice images are consistent with the reference tone. The plurality of processed slice images include the processed reference slice image and the processed slice images corresponding to all the other slice images.
- Performing tone processing on all slice images of the plurality of slice images except the reference slice image, to obtain the processed slice images corresponding to those slice images, includes, for the i-th slice image among those slice images: determining a reference area corresponding to the i-th slice image, wherein the reference area provides a tone reference for the i-th slice image and the tone of the reference area is consistent with the reference tone; and performing tone processing on the i-th slice image to obtain the processed slice image corresponding to the i-th slice image, where i is a positive integer.
- Determining the reference area corresponding to the i-th slice image includes: determining, among the plurality of slice images, a reference slice image corresponding to the i-th slice image, wherein the area of that reference slice image overlapping the i-th slice image is an overlapping area, and that reference slice image has already undergone tone processing before the i-th slice image is processed; obtaining the processed reference slice image corresponding to that reference slice image; determining a processed overlapping area based on the overlapping area and the processed reference slice image, wherein the processed overlapping area is the area of the processed reference slice image corresponding to the overlapping area; and determining at least part of the processed overlapping area as the reference area.
- Determining at least part of the processed overlapping area as the reference area includes: in response to the width of the processed overlapping area in the arrangement direction being equal to the first pixel width, taking the processed overlapping area as the reference area; and, in response to the width of the processed overlapping area in the arrangement direction being greater than the first pixel width, selecting a partial area of the processed overlapping area as the reference area, wherein the width of the partial area in the arrangement direction is the first pixel width, and the arrangement direction is the direction of the line connecting the centers of the two slice regions that form the overlapping area.
- Performing tone processing on the i-th slice image to obtain the processed slice image corresponding to the i-th slice image includes: determining the area of the i-th slice image corresponding to the reference area as a first area; determining the area of the i-th slice image other than the first area as a second area; combining the second area and the reference area to obtain a to-be-processed slice image corresponding to the i-th slice image, wherein the size of the to-be-processed slice image is the same as the size of the i-th slice image; and performing tone processing on the to-be-processed slice image to obtain the processed slice image corresponding to the i-th slice image.
- Performing tone processing on the to-be-processed slice image to obtain the processed slice image corresponding to the i-th slice image includes: based on the reference area in the to-be-processed slice image, performing tone processing on the second area in the to-be-processed slice image to obtain a processed area corresponding to the second area, wherein the reference area provides a tone reference for the tone processing of the second area and the tone of the processed area is consistent with the tone of the reference area; and obtaining the processed slice image corresponding to the i-th slice image based on the reference area and the processed area.
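The disclosure does not fix a particular algorithm for pulling the second area's tone toward the reference area (a neural-network color-restoration model is mentioned only in passing later in the description). As a hedged illustration, the sketch below uses per-channel mean/standard-deviation matching; the function name `match_tone` and the NumPy-based interface are assumptions for illustration, not part of the claimed method.

```python
import numpy as np

def match_tone(region, reference):
    """Shift/scale each channel of `region` so its per-channel mean and
    standard deviation match those of `reference` (simple statistics
    transfer; a stand-in for the patent's unspecified tone processing)."""
    region = region.astype(np.float64)
    reference = reference.astype(np.float64)
    out = np.empty_like(region)
    for c in range(region.shape[-1]):
        r_mu, r_sigma = region[..., c].mean(), region[..., c].std()
        t_mu, t_sigma = reference[..., c].mean(), reference[..., c].std()
        scale = t_sigma / r_sigma if r_sigma > 0 else 1.0
        # Re-center on the reference mean, rescale to the reference spread.
        out[..., c] = (region[..., c] - r_mu) * scale + t_mu
    return np.clip(out, 0, 255).astype(np.uint8)
```

Applied to the second area with the reference area as `reference`, this makes the processed area's tone statistics consistent with the reference tone, which is the property the claim requires.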
- Performing slice processing on the initial image to obtain multiple slice images corresponding to the initial image includes: determining a slice size; dividing the initial image, according to the slice size, into a plurality of slice areas, wherein each slice area at least partially overlaps every slice area adjacent to it; and slicing the initial image according to the plurality of slice areas to obtain the multiple slice images, wherein the multiple slice images correspond one-to-one to the plurality of slice areas and each slice image comprises one slice area of the plurality of slice areas.
- Performing stitching processing on the plurality of processed slice images to obtain a composite image includes: performing stitching processing on the plurality of processed slice images, according to the positional relationship of the plurality of slice images in the initial image, to obtain an intermediate composite image, wherein all the pixels in the intermediate composite image are arranged in n rows and m columns; in the case where the position at the t1-th row and t2-th column of the intermediate composite image includes only one pixel, taking the pixel value of that pixel as the pixel value of the pixel at the t1-th row and t2-th column of the composite image; and, in the case where the position at the t1-th row and t2-th column of the intermediate composite image includes a plurality of pixels, determining the pixel value of the pixel at the t1-th row and t2-th column of the composite image from that plurality of pixels.
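The claim text above breaks off before stating how a position covered by several pixels is resolved. A minimal stitching sketch, assuming purely for illustration that overlapping pixels are averaged, could look like this; the helper name `stitch` and the `(x, y, w, h)` region tuples are hypothetical.

```python
import numpy as np

def stitch(slices, regions, width, height):
    """Paste processed slices back at their original positions.
    Where several slices cover the same pixel, this sketch averages
    their values; that resolution is an assumption, since the claim
    text is truncated before specifying how multiple pixels combine."""
    acc = np.zeros((height, width, 3), dtype=np.float64)
    cnt = np.zeros((height, width, 1), dtype=np.float64)
    for img, (x, y, w, h) in zip(slices, regions):
        acc[y:y + h, x:x + w] += img   # accumulate pixel values
        cnt[y:y + h, x:x + w] += 1     # count covering slices per pixel
    return (acc / np.maximum(cnt, 1)).astype(np.uint8)
```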
- Acquiring the initial image includes: acquiring an original image; and adding a preset border along the edge of the original image, on the side away from the center of the original image, to obtain the initial image, wherein the initial image includes the original image and the preset border, and the color of the preset border is a preset color. Before performing tone processing on the plurality of slice images, the method further includes performing edge removal processing on the plurality of slice images.
- Adding a preset border along the edge of the original image, on the side away from the center of the original image, to obtain the initial image includes: determining the edge of the original image; generating the preset border based on the edge of the original image, wherein the preset border includes a first edge and a second edge; and setting the preset border on the side of the edge of the original image away from the center of the original image to obtain the initial image, wherein the first edge of the preset border overlaps the edge of the original image, the second edge of the preset border is separated from the edge of the original image by a second pixel width, and the edges of the initial image are the second edges.
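As a rough illustration of the preset border described above: the sketch pads the original image on all sides by the second pixel width, so the inner boundary of the border (the first edge) coincides with the original image's edge, and the outer boundary (the second edge) lies one border width away. The helper name `add_preset_border` and the default white border color are assumptions; the patent only says the border has a preset color.

```python
import numpy as np

def add_preset_border(original, border_width, color=(255, 255, 255)):
    """Surround `original` with a border of a preset color.
    `border_width` corresponds to the 'second pixel width' above."""
    h, w, c = original.shape
    framed = np.empty((h + 2 * border_width, w + 2 * border_width, c),
                      dtype=original.dtype)
    framed[...] = np.asarray(color, dtype=original.dtype)  # fill border color
    framed[border_width:border_width + h,
           border_width:border_width + w] = original        # paste original
    return framed
```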
- Each slice image includes four slice edges, and performing edge removal processing on the plurality of slice images includes: in response to detecting that the i-th slice edge of a slice image is part of the second edge, performing edge removal processing on that i-th slice edge; and, in response to detecting that the i-th slice edge of a slice image is not part of the second edge, not performing edge removal processing on that i-th slice edge, where i is a positive integer less than or equal to 4.
- the composite image is a color image.
- At least one embodiment of the present disclosure provides an image processing apparatus, including: an acquisition unit configured to acquire an initial image; a slice processing unit configured to perform slice processing on the initial image to obtain multiple slice images corresponding to the initial image; a tone processing unit configured to perform tone processing on the multiple slice images to obtain multiple processed slice images, wherein the tones of the multiple processed slice images are consistent; and a synthesis unit configured to perform stitching processing on the multiple processed slice images, according to the positional relationship of the multiple slice images in the initial image, to obtain a composite image.
- At least one embodiment of the present disclosure provides an electronic device, including: a memory non-transitorily storing computer-executable instructions; and a processor configured to execute the computer-executable instructions, wherein the computer-executable instructions, when executed by the processor, implement the image processing method according to any embodiment of the present disclosure.
- At least one embodiment of the present disclosure provides a non-transitory computer-readable storage medium storing computer-executable instructions that, when executed by a processor, implement the image processing method according to any embodiment of the present disclosure.
- FIG. 1 is a schematic flowchart of an image processing method provided by at least one embodiment of the present disclosure;
- FIG. 2 is a schematic diagram of an initial image provided by at least one embodiment of the present disclosure;
- FIGS. 3A-3E are schematic diagrams of an image slice processing process provided by at least one embodiment of the present disclosure;
- FIG. 4A is a schematic flowchart of step S30 in the image processing method shown in FIG. 1;
- FIG. 4B is a schematic flowchart of step S305 in the image processing method shown in FIG. 4A;
- FIG. 4C is a schematic diagram of a slice image A and a slice image B provided by an embodiment of the present disclosure;
- FIG. 4D is a schematic diagram of a slice image B and a slice image C provided by an embodiment of the present disclosure;
- FIG. 4E is a schematic diagram of a slice image to be processed provided by an embodiment of the present disclosure;
- FIGS. 5A-5D are schematic diagrams of slice images to be processed and corresponding processed slice images provided by at least one embodiment of the present disclosure;
- FIG. 6A is a schematic diagram of an intermediate composite image provided by at least one embodiment of the present disclosure;
- FIGS. 6B-6E are schematic diagrams of a processing process of a composite image provided by at least one embodiment of the present disclosure;
- FIG. 6F is a schematic diagram of a composite image provided by an embodiment of the present disclosure;
- FIG. 7A is a schematic diagram of an initial image provided by an embodiment of the present disclosure;
- FIG. 7B is a schematic diagram of a processed image provided by an embodiment of the present disclosure;
- FIG. 7C is a schematic diagram of a slice image provided by an embodiment of the present disclosure;
- FIGS. 7D-7F are schematic diagrams of an image processing process including edge removal processing provided by an embodiment of the present disclosure;
- FIGS. 8A-8C are schematic diagrams of an image processing process provided by an embodiment of the present disclosure;
- FIG. 9 is a schematic block diagram of an image processing apparatus provided by at least one embodiment of the present disclosure;
- FIG. 10 is a schematic diagram of an electronic device provided by at least one embodiment of the present disclosure;
- FIG. 11 is a schematic diagram of a non-transitory computer-readable storage medium provided by at least one embodiment of the present disclosure;
- FIG. 12 is a schematic diagram of a hardware environment provided by at least one embodiment of the present disclosure.
- A color restoration model can be used for this processing: for example, a color restoration model processes a color image into an image with real color, or processes a black-and-white image into a color image. Because the color restoration model occupies relatively large memory, the processing rate of color restoration is reduced; moreover, in some examples where the image is processed on a mobile terminal, the memory occupied by the color restoration model may exceed the memory of the mobile terminal, making image processing impossible.
- An image slicing method can be used to divide the original image into multiple slice images and perform color restoration processing on each slice image separately, thereby reducing the memory resources occupied during color restoration.
- However, because the color restoration model then cannot obtain the global information of the digital image, the tones of the slice images become inconsistent after color restoration processing.
- At least one embodiment of the present disclosure provides an image processing method, an image processing apparatus, an electronic device, and a non-transitory computer-readable storage medium.
- The image processing method includes: acquiring an initial image; slicing the initial image to obtain multiple slice images corresponding to the initial image; performing tone processing on the multiple slice images to obtain multiple processed slice images, wherein the tones of the multiple processed slice images remain consistent; and, according to the positional relationship of the multiple slice images in the initial image, performing stitching processing on the multiple processed slice images to obtain a composite image.
- The image processing method performs tone processing on each of the multiple slice images such that the tones of the processed slice images are consistent, so that the final composite image has a uniform tone.
- The image processing method provided by the embodiments of the present disclosure can be applied to a mobile terminal (e.g., a mobile phone, a tablet computer, etc.); while improving the processing speed, it keeps the tones of the processed images uniform, so that the images collected by the mobile terminal can undergo real-time color restoration processing.
- the image processing method provided by the embodiment of the present disclosure can be applied to the image processing apparatus provided by the embodiment of the present disclosure, and the image processing apparatus can be configured on an electronic device.
- the electronic device may be a personal computer, a mobile terminal, etc.
- the mobile terminal may be a hardware device with various operating systems, such as a mobile phone, a tablet computer, and the like.
- FIG. 1 is a schematic flowchart of an image processing method provided by at least one embodiment of the present disclosure.
- FIG. 2 is a schematic diagram of an initial image provided by at least one embodiment of the present disclosure.
- the image processing method provided by at least one embodiment of the present disclosure includes steps S10 to S40.
- Step S10: acquire an initial image.
- Step S20: perform slice processing on the initial image to obtain multiple slice images corresponding to the initial image.
- Step S30: perform tone processing on the multiple slice images to obtain multiple processed slice images, wherein the tones of the multiple processed slice images are consistent.
- Step S40: according to the positional relationship of the multiple slice images in the initial image, perform stitching processing on the multiple processed slice images to obtain a composite image.
- The initial image acquired in step S10 may include at least one object. The object may be a character, and the character may include Chinese (e.g., Chinese characters or pinyin), English, Japanese, French, Korean, Latin, numbers, etc. In addition, the object may also include various symbols (e.g., greater-than signs, less-than signs, percent signs, etc.) and various graphics and images (hand-drawn or printed). The objects may include printed or machine-typed characters as well as handwritten characters.
- For example, the objects in the initial image may include printed words and letters (e.g., in various national languages and scripts such as English, Japanese, French, Korean, German, or Latin), printed numbers (e.g., dates, weights, dimensions, etc.), printed symbols and images, handwritten words and letters, handwritten numbers, and handwritten symbols and graphics.
- the initial image may be various types of images, for example, may be an image of a shopping list, an image of a dining receipt, an image of a test paper, an image of a contract, an image of a painting, and the like. As shown in FIG. 2, the initial image may be an image of a letter.
- the shape of the initial image may be a rectangle or the like.
- the shape and size of the initial image can be set by the user according to the actual situation.
- the initial image may be an image captured by an image acquisition device (eg, a digital camera or a mobile phone, etc.), and the initial image may be a grayscale image or a color image.
- the initial image refers to a form in which an object is visually presented, such as a picture.
- the initial image can also be obtained by scanning or the like.
- the initial image may be an image directly collected by an image collection device, or may be an image obtained after preprocessing the collected image.
- the image processing method provided by at least one embodiment of the present disclosure may further include an operation of preprocessing the initial image.
- The preprocessing may include, for example, cropping, gamma correction, or noise-reduction filtering of the image directly collected by the image collection device. Preprocessing can eliminate irrelevant information or noise in the initial image, facilitating the subsequent processing of the initial image.
- Step S20 may include: determining a slice size; dividing the initial image, according to the slice size, into a plurality of slice areas, wherein each slice area at least partially overlaps every slice area adjacent to it; and performing slicing processing on the initial image according to the multiple slice areas to obtain multiple slice images, wherein the multiple slice images correspond one-to-one to the multiple slice areas and each slice image includes one slice area of the multiple slice areas.
- the slice size is the actual slice size, that is, the size of the sliced image after the initial image is sliced.
- the width and height corresponding to the slice size may be equal, or the width and height corresponding to the slice size may also be unequal.
- the slice size may be 576 ⁇ 576, but the present disclosure is not limited thereto, and the slice size may be set according to actual conditions.
- the unit of size is pixels, that is, for example, the slice size is 576 ⁇ 576, which means that the slice size is 576 pixels ⁇ 576 pixels.
- "Adjacent" includes adjacent on the upper side, the left side, the right side, and the lower side. That is, the slice areas adjacent to a certain slice area include the slice area located on its upper side, the slice area located on its left side, the slice area located on its right side, and the slice area located on its lower side.
- the overlapping portion between two adjacent slice regions has at least a first pixel width in a first direction, for example, the first direction is a direction of a line connecting the centers of the two adjacent slice regions.
- the first pixel width may be set as required, for example, the first pixel width may be any value in the range of 20 pixels-50 pixels, for example, 32 pixels.
- multiple slice regions can be sequentially determined.
- FIGS. 3A to 3E are schematic diagrams of an image slice processing process provided by at least one embodiment of the present disclosure. As shown in FIGS. 3A to 3E, in some embodiments, the size of the initial image is 1660×1660, the slice size is 576×576, and the first pixel width is 32 pixels.
- The slice area A is the area determined based on the slice size that takes the upper-left corner of the initial image as one vertex and the left side and the upper side of the initial image as boundaries.
- The slice area B is determined based on the slice area A. The direction of the line connecting the center of slice area A and the center of slice area B is the horizontal direction, and this line is parallel to the upper side of the initial image.
- The upper side of slice area B overlaps the upper side of the initial image, and the left side of slice area B and the right side of slice area A form an overlapping area, shown as the shaded part P1 in FIG. 3A. The width of the overlapping area P1 in the horizontal direction is 32 pixels, and the size of slice area B is also 576×576.
- The slice area C is determined based on the slice area B. Since the width of the initial image is 1660 pixels, the distance between the right side of slice area B and the right side of the initial image is 540 pixels (1660 - (576 + 576 - 32)). If slice area C were determined by making the right side of slice area B and the left side of slice area C form an overlapping area 32 pixels wide in the horizontal direction, slice area C would exceed the range of the initial image. Therefore, slice area C is instead determined based on the upper-right corner vertex of the initial image.
- That is, slice area C is the area determined based on the slice size that takes the upper-right corner of the initial image as one vertex and the right side and the upper side of the initial image as boundaries.
- The horizontal width of the overlapping area formed by the right side of slice area B and the left side of slice area C is 36 pixels; this overlapping area is shown as the shaded part P2 in FIG. 3B.
- The direction of the line connecting the center of slice area B and the center of slice area C is the horizontal direction, and this line is parallel to the upper side of the initial image; that is, the centers of slice areas A, B, and C lie on the same straight line.
- The slice area D is determined based on the slice area A. The direction of the line connecting the center of slice area A and the center of slice area D is the vertical direction, and this line is parallel to the right side of the initial image.
- The left side of slice area D overlaps the left side of the initial image, and the upper side of slice area D and the lower side of slice area A form an overlapping area, shown as the shaded part P3 in FIG. 3C. The width of the overlapping area P3 in the vertical direction is 32 pixels, and the size of slice area D is also 576×576.
- The horizontal direction and the vertical direction are perpendicular to each other. The width of the overlapping area P1 in the vertical direction is 576 pixels, and the width of the overlapping area P3 in the horizontal direction is 576 pixels. The overlapping area P1 and the overlapping area P3 partially overlap each other, and the size of the portion where they overlap is 32×32.
- The slice area E is determined based on the slice area D and the slice area B. The direction of the line connecting the center of slice area D and the center of slice area E is the horizontal direction, and this line is parallel to the upper side of the initial image; the direction of the line connecting the center of slice area B and the center of slice area E is the vertical direction, and this line is parallel to the left side of the initial image.
- The left side of slice area E and the right side of slice area D form an overlapping area, shown as the shaded part P4 in FIG. 3D; the width of the overlapping area P4 is 32 pixels in the horizontal direction and 576 pixels in the vertical direction. The upper side of slice area E and the lower side of slice area B form an overlapping area, shown as the shaded part P5 in FIG. 3D; the width of the overlapping area P5 is 32 pixels in the vertical direction and 576 pixels in the horizontal direction. The size of slice area E is also 576×576.
- The slice area F to the slice area I are sequentially determined according to the above method; the specific process is not repeated, and finally a schematic diagram of all the slice areas, as shown in FIG. 3E, is obtained.
- The size of each slice area is 576×576; each slice area has at least one overlapping area with every adjacent slice area, and the width of each overlapping area in its extension direction is at least 32 pixels.
- For example, the slice areas adjacent to slice area E are slice areas B, D, F, and H. Between slice area E and slice area B there is an overlapping area Q1 with a width of 32 pixels in the first direction (i.e., the vertical direction) and 576 pixels in the horizontal direction; between slice area E and slice area D there is an overlapping area Q2 with a width of 32 pixels in the second direction (i.e., the horizontal direction) and 576 pixels in the vertical direction; between slice area E and slice area F there is an overlapping area Q3 with a width of 36 pixels in the second direction and 576 pixels in the vertical direction; and between slice area E and slice area H there is an overlapping area Q4 with a width of 36 pixels in the first direction and 576 pixels in the horizontal direction.
- the order of determining the slicing regions in the above example is merely exemplary; in the present disclosure, the order may be set as required, and different orders may produce different overlapping regions, which is not limited by the present disclosure.
- the slice regions may be determined sequentially from the right side of the initial image.
- the width of the overlapping area between slice area B and slice area C in the second direction may be 36 pixels, and the width of the overlapping area between slice area B and slice area A in the second direction may be 32 pixels.
- the initial image may be sliced according to the slice areas shown in FIG. 3E to obtain slice images A through I corresponding one-to-one to slice areas A through I, the size of each slice image being 576×576.
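The overlapping slice layout described above can be sketched in Python. This is an illustrative sketch rather than part of the disclosure: the function names are invented, a 1660×1660 initial image is assumed (3×576 minus the 32- and 36-pixel overlaps), and spreading the tiles evenly yields 34-pixel overlaps, which still satisfies the "at least 32 pixels" constraint:

```python
import math

def tile_starts(length, tile, min_overlap):
    """Start offsets of tiles of size `tile` covering `length` pixels,
    with adjacent tiles overlapping by at least `min_overlap` pixels."""
    if tile >= length:
        return [0]
    # smallest tile count whose coverage (with minimal overlap) spans `length`
    n = math.ceil((length - min_overlap) / (tile - min_overlap))
    # spread starts evenly; first tile begins at 0, last ends exactly at `length`
    return [round(i * (length - tile) / (n - 1)) for i in range(n)]

def slice_areas(width, height, tile=576, min_overlap=32):
    """(left, top, right, bottom) of every slice area, row-major like A..I."""
    xs = tile_starts(width, tile, min_overlap)
    ys = tile_starts(height, tile, min_overlap)
    return [(x, y, x + tile, y + tile) for y in ys for x in xs]

# an assumed 1660x1660 initial image yields a 3x3 grid of 576x576 slice areas
areas = slice_areas(1660, 1660)
```

Each slice image is then cut from the initial image at its area's coordinates, so pixels inside an overlap appear identically in both neighboring slices.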
- FIG. 4A is a schematic flowchart of step S30 in the image processing method shown in FIG. 1 .
- step S30 in the image processing method may specifically include steps S301 to S305.
- Step S301: determining the processing order.
- Step S302: determining, based on the processing order, a reference slice image among the plurality of slice images, wherein the reference slice image is the first slice image to undergo tone processing as determined by the processing order.
- Step S303: performing tone processing on the reference slice image to obtain a processed reference slice image corresponding to the reference slice image.
- Step S304: taking the tone of the processed reference slice image as the reference tone.
- Step S305: performing tone processing, based on the reference tone, on all the slice images other than the reference slice image, so as to obtain processed slice images respectively corresponding to those slice images.
- In step S305, the tones of the processed slice images corresponding to all the slice images are consistent with the reference tone, and the plurality of processed slice images include the processed reference slice image and the processed slice images respectively corresponding to the remaining slice images.
- the processing order in step S301 may be a predetermined order.
- any slice image may be selected as the first slice image to undergo tone processing, that is, as the reference slice image; then any slice image adjacent to the reference slice image may be selected as the next image to be processed, and so on, thereby determining the processing order.
- the processing order may be A-B-C-D-E-F-G-H-I; in this case the reference slice image is slice image A, and the tone of the processed slice image corresponding to slice image A is used as the reference tone
- tone processing is performed on slice images B through I based on the reference tone, so that the tones of the processed slice images corresponding to slice images B through I are consistent with the reference tone
- the processing order may also be C-B-A-D-E-F-I-H-G; in this case the reference slice image is slice image C, and the tone of the processed slice image corresponding to slice image C is used as the reference tone, so that slice image A, slice image B, and slice images D through I are tone-processed based on the reference tone, and the tones of their processed slice images are consistent with the reference tone.
- in this way, a reference tone is first determined, that is, the tone of the processed reference slice image is taken as the reference tone, and the other slice images are then tone-processed based on the reference tone, so that the tones of the processed slice images corresponding to the other slice images are consistent with the reference tone; as a result, all the processed slice images have a consistent tone, and the final composite image has a uniform tone.
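The reference-tone flow of steps S301-S305 can be sketched as follows. This is an illustrative stand-in, not the disclosed method: tone is approximated by the mean intensity of a flat grayscale slice, and `match_tone` (an invented helper) shifts a slice's mean toward the reference tone, whereas the disclosure uses a trained color restoration model:

```python
def mean_tone(img):
    """Average intensity of a flat grayscale slice image (list of 0-255 ints)."""
    return sum(img) / len(img)

def match_tone(img, reference_tone):
    """Shift pixel values so the slice's mean matches the reference tone."""
    shift = reference_tone - mean_tone(img)
    return [min(255, max(0, round(p + shift))) for p in img]

def process_in_order(slices, order):
    """The first slice in `order` sets the reference tone; the rest follow it."""
    processed = {}
    ref_name = order[0]
    # the reference slice defines the reference tone after its own processing
    processed[ref_name] = match_tone(slices[ref_name], mean_tone(slices[ref_name]))
    reference_tone = mean_tone(processed[ref_name])
    for name in order[1:]:
        processed[name] = match_tone(slices[name], reference_tone)
    return processed
```

With this structure, every processed slice ends up with the same (mean) tone as the processed reference slice, mirroring the consistency requirement of step S305.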
- FIG. 4B is a schematic flowchart of step S305 in the image processing method shown in FIG. 4A .
- step S305 may specifically include steps S3051 to S3052.
- In step S3051, a reference area corresponding to the ith slice image is determined, wherein the reference area is used to provide a tone reference for the ith slice image during tone processing, and the tone of the reference area is consistent with the reference tone.
- step S3051 may include: determining a reference slice image corresponding to the ith slice image in the plurality of slice images, wherein an area overlapping with the ith slice image in the reference slice image is an overlapping area, And before the i-th slice image is subjected to tone processing, the reference slice image has been subjected to tone processing; the processed reference slice image corresponding to the reference slice image is obtained; based on the overlapping area and the processed reference slice image, the processed overlapping area is determined, wherein , the processed overlapping area is the area corresponding to the overlapping area in the processed reference slice image; at least part of the processed overlapping area is determined as the reference area.
- determining at least part of the processed overlapping area as the reference area may include: in response to the width of the processed overlapping area in the extension direction being equal to the first pixel width, taking the entire processed overlapping area as the reference area; and in response to the width of the processed overlapping area in the extension direction being greater than the first pixel width, selecting a partial area of the processed overlapping area as the reference area, wherein the width of the partial area in the extension direction is the first pixel width. For example, the extension direction is the direction of the line connecting the center of the processed overlapping area and the center of the area other than the processed overlapping area in the processed reference slice image.
- the reference area is thus defined with a fixed width, so that uniform processing parameters are maintained and image processing efficiency is improved when tone processing is performed on the slice images.
- the reference slice image needs to satisfy two conditions: first, the slice area corresponding to the reference slice image must at least partially overlap the slice area corresponding to the ith slice image to be processed; second, the reference slice image must have undergone tone processing before the ith slice image is processed.
- for example, the reference slice image (the first slice image to be processed) is slice image A; for slice image B, its reference slice image may be slice image A; for slice image C, its reference slice image may be slice image B; for slice image D, its reference slice image may be slice image A; for slice image E, its reference slice image may be slice image B or slice image D; for slice image F, its reference slice image may be slice image E or slice image C; and so on.
- FIG. 4C is a schematic diagram of slice image A and slice image B provided by an embodiment of the present disclosure. Since slicing area A and slicing area B share the overlapping shaded area P1 when the initial image is divided into slice areas (as shown in FIG. 3A), the pixels in the parts of slice image A and slice image B corresponding to the shaded area P1 have the same pixel values; that is, pixels at the same position in shaded area O1 and shaded area O2 in FIG. 4C have the same pixel values.
- the "area that overlaps with the ith slice image in the reference slice image" may be the shaded area O1 shown in FIG. 4C, and the processed overlapping area may be the part corresponding to the shaded area O1 in the processed slice image A' corresponding to slice image A
- since the width of the processed overlapping area in the extension direction equals the first pixel width (32 pixels), the entire processed overlapping area may be selected as the reference area corresponding to slice image B
- FIG. 4D is a schematic diagram of a slice image B and a slice image C provided by an embodiment of the present disclosure.
- the ith slice image is slice image C
- the reference slice image of slice image C is slice image B
- the first pixel width is 32
- the "area that overlaps with the ith slice image in the reference slice image" may be the area O3 defined by the thick-line frame in FIG. 4D
- the processed overlapping area is the part corresponding to the area O3 in the processed slice image B' corresponding to slice image B; since the width of the processed overlapping area in the extension direction is greater than the first pixel width, a part of the processed overlapping area may be selected as the reference area corresponding to slice image C
- for example, the part of the processed overlapping area corresponding to the area that starts from the left edge of slice image C and has a width equal to the first pixel width in the extension direction (that is, the shaded area R2 in FIG. 4D) may be selected as the reference area; in other words, the area starting from the left edge of the processed overlapping area and having a width equal to the first pixel width in the extension direction is used as the reference area, namely the part corresponding to the shaded area R1 in FIG. 4D in the processed slice image B' corresponding to slice image B.
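Selecting the reference area from the processed overlapping area can be sketched as below, assuming (as in the FIG. 4D example) that the reference slice lies to the left of the current slice, so the overlap occupies the rightmost columns of the processed reference slice; the function name and the list-of-rows image representation are illustrative assumptions:

```python
FIRST_PIXEL_WIDTH = 32  # assumed first pixel width, as in the example

def reference_area_from_left_neighbor(processed_ref, overlap_w, k=FIRST_PIXEL_WIDTH):
    """processed_ref: 2D list (rows of pixels) of the processed reference slice.
    Its overlap with the current slice is the rightmost `overlap_w` columns;
    the reference area is the leftmost `k` columns of that overlap."""
    w = len(processed_ref[0])
    start = w - overlap_w          # left edge of the processed overlapping area
    return [row[start:start + min(k, overlap_w)] for row in processed_ref]
```

When the overlap is exactly `k` wide (the 32-pixel case), the whole overlap is returned; when it is wider (the 36-pixel case), only its leftmost `k` columns are kept, matching the fixed-width rule above.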
- In step S3052, tone processing is performed on the ith slice image based on the reference area to obtain a processed slice image corresponding to the ith slice image, where i is a positive integer.
- step S3052 may include: determining the area in the ith slice image corresponding to the reference area as the first area; determining the area in the ith slice image other than the first area as the second area; combining the second area and the reference area to obtain a slice image to be processed corresponding to the ith slice image, wherein the size of the slice image to be processed is the same as the size of the ith slice image; and performing tone processing on the slice image to be processed to obtain the processed slice image corresponding to the ith slice image.
- the i-th slice image is slice image B
- the reference area is the area corresponding to the shaded part O1 in the processed slice image A'
- the first area is the shaded area O2 in slice image B
- the second area is the part other than the shadow area O2 in the sliced image B
- the second area and the reference area are combined to obtain the sliced image to be processed corresponding to the sliced image B.
- a schematic diagram of the slice image to be processed is shown in FIG. 4E
- the reference area (the area defined by the dotted frame in FIG. 4E ) and the second area are combined into a slice image to be processed according to the positional relationship between the first area and the second area.
- the ith slice image is slice image C
- the reference area is the area corresponding to the shaded part R1 in the processed slice image B' (the width in the extending direction is 32 pixels)
- the first area is the shaded area R2 in the sliced image C
- the second area is the part (with a width of (576-32) pixels) other than the shaded area R2 in the sliced image C
- the second area and the reference area are combined to obtain the slice image to be processed corresponding to slice image C; a schematic diagram of the slice image to be processed is shown in FIG. 4E, in which the reference area and the second area are stitched into the slice image to be processed according to the positional relationship between the first area and the second area.
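Combining the reference area with the second area into the slice image to be processed can be sketched as follows, again assuming the reference area replaces the leftmost columns of the current slice; the helper name is an assumption:

```python
def build_to_be_processed(slice_img, ref_area):
    """Replace the first-area columns (here, the left edge of the current
    slice) with the reference area taken from the processed reference slice.
    The result has the same size as the original slice image."""
    k = len(ref_area[0])  # reference-area width, e.g. the first pixel width
    return [ref_row + row[k:] for ref_row, row in zip(ref_area, slice_img)]
```

Because only the first-area pixels are swapped out, the slice image to be processed keeps the geometry of the original slice while carrying the already-toned reference pixels as the tone anchor.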
- consider, for example, the slice image E corresponding to the slice area E shown in FIG. 3E
- the processing sequence is A-B-C-D-E-F-G-H-I
- the reference slice image of slice image E may be slice image D or slice image B.
- the first reference area may be determined based on slice image D
- the second reference area may be determined based on slice image B
- the determination method of the reference area is as described above, and will not be repeated here; after that, the area in the slice image E except the area corresponding to the first reference area and the second reference area is used as the second area.
- the first reference area, the second reference area, and the second area are combined to obtain the slice image to be processed.
- tone processing may be performed on the slice image to be processed according to the first reference area, that is, the first reference area provides the tone reference during tone processing; alternatively, tone processing may be performed on the slice image to be processed according to the second reference area, that is, the second reference area provides the tone reference; alternatively, both the first reference area and the second reference area may provide the tone reference for the slice image to be processed during tone processing. It should be noted that the first reference area and the second reference area partially overlap.
- alternatively, during tone processing, the image data of the reference area may be extracted from the processed reference slice image, the corresponding part of the ith slice image may be replaced with that image data, and the replaced slice image may be used as the slice image to be processed; this is not limited by the present disclosure.
- performing tone processing on the slice image to be processed to obtain the processed slice image corresponding to the ith slice image may include: performing tone processing on the second area in the slice image to be processed based on the reference area in the slice image to be processed, To obtain a processed area corresponding to the second area, wherein the reference area is used to provide a tone reference for the tone processing of the second area, and the tone of the processed area is consistent with the tone of the reference area; based on the reference area and the processed area, Obtain the processed slice image corresponding to the ith slice image.
- the reference area and the processed area may be combined into a processed slice image according to the positional relationship between the reference area and the second area.
- toning processing of slice images to be processed can be achieved by a pre-trained color restoration processing model.
- the color restoration processing model can be trained based on a neural network model.
- the neural network model may be a model with a U-Net structure; for example, a large number of sample pairs, each consisting of an original image and a color-edited rendering of that image, may be used as training data to train the neural network model and establish the color restoration processing model.
- FIGS. 5A-5D are schematic diagrams of slice images to be processed and the corresponding processed slice images provided by at least one embodiment of the present disclosure.
- the slice image to be processed is obtained by executing the image processing method provided by at least one embodiment of the present disclosure on the initial image shown in FIG. 2 .
- the distribution of the slicing areas can be as shown in FIG. 3E
- the processing order may be A-B-C-D-E-F-G-H-I; that is, by performing the slicing process of step S20 on the initial image shown in FIG. 2, slice images A through I respectively corresponding to slice areas A through I in FIG. 3E are obtained.
- the slice image A to be processed is slice image A itself (slice image A is the reference slice image)
- the slice image A to be processed is subjected to tone processing to obtain the processed slice image A' (shown in the dotted box on the right side of FIG. 5A ).
- the slice image B to be processed includes a reference area and a second area, wherein the reference area (shown by the solid-line frame in FIG. 5B) is a partial area of the processed slice image A' in FIG. 5A,
- the second area (indicated by the dotted box in FIG. 5B ) is a partial area in the sliced image B, and the determination method and combination method of the reference area and the second area are as described above, and will not be repeated here.
- the to-be-processed slice image B is subjected to tone processing to obtain a processed slice image B' (shown by the dotted box on the right in Fig. 5B ).
- the slice image C to be processed includes a reference area and a second area, wherein the reference area (shown by the solid-line frame in FIG. 5C) is a partial area of the processed slice image B' in FIG. 5B,
- the second area (indicated by the dotted box in FIG. 5C ) is a partial area in the slice image C, and the determination and combination methods of the reference area and the second area are as described above, and will not be repeated here.
- the to-be-processed slice image C is subjected to tone processing to obtain a processed slice image C' (shown by the dotted box on the right in Fig. 5C ).
- the slice image D to be processed includes a reference area and a second area, wherein the reference area (shown by the solid-line frame in FIG. 5D) is a partial area of the processed slice image A' in FIG. 5A,
- the second area (indicated by the dotted box in FIG. 5D ) is a partial area in the slice image D, and the determination method and combination method of the reference area and the second area are as described above, and will not be repeated here.
- Tone processing is performed on the sliced image D to be processed to obtain the processed sliced image D' (shown by the dotted box on the right in Fig. 5D ).
- the reference tone is first determined based on the first processed slice image, and then other slice images are processed based on the reference tone.
- the tone processing is performed to ensure that the tone of each sliced image is consistent with the tone of the reference area, and the tone of the reference area is consistent with the reference tone, so that all sliced images have a uniform tone.
- step S40 may include: performing stitching processing on the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image to obtain an intermediate composite image, in which all the pixels are arranged in n rows and m columns; in the case where the t1th row and t2th column of the intermediate composite image includes only one pixel, taking the pixel value of that pixel as the pixel value of the pixel in the t1th row and t2th column of the composite image; and in the case where the t1th row and t2th column of the intermediate composite image includes a plurality of pixels, selecting the pixel value of any one of the plurality of pixels as the pixel value of the pixel in the t1th row and t2th column of the composite image, where n, m, t1, and t2 are all positive integers, t1 is less than or equal to n, and t2 is less than or equal to m.
- the composite image may be a color image, in which case the pixel value of a pixel may include a set of RGB pixel values; alternatively, the composite image may be a monochrome image, in which case the pixel value of a pixel may be the pixel value of a single color channel.
- FIG. 6A is a schematic diagram of an intermediate composite image provided by at least one embodiment of the present disclosure.
- A'-I' in FIG. 6A respectively represent the processed slice images A' through I' corresponding to slice areas A through I shown in FIG. 3E; the size of each processed slice image is 576×576, the same as the size of the corresponding slice area.
- the shaded areas in FIG. 6A represent the overlapping areas between the processed slice images that result from the overlap between the slice areas during the slicing process.
- the width of each overlapping area in its extension direction is the same as the width of the corresponding overlapping area between the slice areas.
- the pixel at row t1 and column t2 in the intermediate composite image is point q1 in FIG. 6A
- point q1 is a pixel in the overlapping area between the processed slice image A' and the processed slice image B'
- the t1th row and t2th column (ie point q1) in the intermediate composite image includes 2 pixels
- the 2 pixels are the pixel at point q1 in the processed slice image A' and the pixel at point q1 in the processed slice image B'.
- the pixel value of any one of the two pixels may be selected as the pixel value of the pixel located at the t1-th row and the t2-th column in the composite image.
- In step S30, when the reference slice image corresponding to slice image B is slice image A, the processed overlapping area in the processed slice image B' is the reference area corresponding to slice image B, and the reference area comes from the processed slice image A'; therefore the pixel values of the two pixels are the same, both being the pixel value of the pixel at point q1 in the processed slice image A'.
- the pixel located at row t1 and column t2 in the intermediate composite image is point q2 in FIG. 6A
- point q2 is a pixel in the overlapping area among the processed slice image A', the processed slice image B', the processed slice image D', and the processed slice image E'
- the t1th row and t2th column of the intermediate composite image (i.e., point q2) includes 4 pixels
- the 4 pixels are the pixel at point q2 in the processed slice image A', the pixel at point q2 in the processed slice image B', the pixel at point q2 in the processed slice image D', and the pixel at point q2 in the processed slice image E'.
- the pixel value of any one of the four pixels may be selected as the pixel value of the pixel located at the t1-th row and the t2-th column in the composite image.
- the reference slice image corresponding to slice image B is slice image A
- the reference slice image corresponding to slice image D is slice image A
- the reference slice image corresponding to slice image E is slice image D
- point q2 belongs not only to the reference area of slice image B, but also to the reference areas of slice image D and slice image E, so the pixel values of the four pixels are the same, all being the pixel value of the pixel at point q2 in the processed slice image A'.
- the pixel located at row t1 and column t2 in the intermediate composite image is point q3 in FIG. 6A
- point q3 is a pixel in the overlapping area between the processed slice image B' and the processed slice image C'
- the t1th row and t2th column of the intermediate composite image includes 2 pixels, namely the pixel at point q3 in the processed slice image B' and the pixel at point q3 in the processed slice image C'; the pixel value of either of the two pixels may be selected as the pixel value of the pixel located at the t1th row and t2th column in the composite image.
- the reference slice image of slice image C is slice image B
- the reference area of slice image C is only a part of the processed overlapping area, i.e., the corresponding part of the processed slice image B' is used as the reference area; therefore, in FIG. 6A, the pixels in the area defined by the dotted frame of the intermediate composite image may have two different pixel values
- in this case, the pixel value of the pixel in the processed slice image corresponding to the reference slice image may be preferentially selected as the pixel value of the pixel in the t1th row and t2th column of the composite image; that is, here the pixel value of the pixel located at the t1th row and t2th column in the processed slice image B' is taken as the pixel value of the pixel at the t1th row and t2th column of the composite image.
- the pixel located at row t1 and column t2 is point q4 in FIG. 6A
- point q4 is a certain pixel in the non-overlapping area.
- the t1th row and t2th column of the intermediate composite image includes only one pixel.
- the pixel value of this one pixel is taken as the pixel value of the pixel located at the t1-th row and the t2-th column in the composite image.
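The pixel-selection rule for the intermediate composite image can be sketched as below. The sketch is illustrative: it pastes processed slices onto a canvas and keeps the first value written to each position, which is one of the "any pixel" choices the disclosure permits (the overlapping pixels are typically identical anyway, as in the q1 and q2 examples):

```python
def assemble(processed, areas, width, height):
    """Stitch processed slices onto a height x width canvas. Where several
    slices cover the same position, the value pasted first is kept (the
    disclosure allows choosing any of the coincident pixels)."""
    canvas = [[None] * width for _ in range(height)]
    for img, (left, top, right, bottom) in zip(processed, areas):
        for y in range(top, bottom):
            for x in range(left, right):
                if canvas[y][x] is None:   # first-wins pixel selection
                    canvas[y][x] = img[y - top][x - left]
    return canvas
```

Swapping the `if canvas[y][x] is None` guard for an unconditional write would instead prefer the later slice, corresponding to the "preferentially select the reference-side pixel" variant described above.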
- At least part of the multiple processed sliced images may be extracted in sequence according to the processing order to obtain multiple images to be spliced, and then, according to the positional relationship of the multiple sliced images in the initial image , splicing multiple images to be spliced to obtain a composite image.
- each image to be spliced is the part of the corresponding processed sliced image except for the processed overlapping area.
- slice image A is the reference slice image
- the to-be-spliced image A″ corresponding to slice image A is the complete processed slice image A'.
- the reference slice image of slice image B is slice image A
- the image to be spliced B″ corresponding to slice image B is the area other than the processed overlapping area in the processed slice image B'.
- in the processed slice image B', the processed overlapping area is the area corresponding to the overlapping area with the processed reference slice image A'
- the image to be spliced B″ is the area defined by the thick black frame in FIG. 6B
- the size of the image to be spliced B″ is 544×576 ((576−32)×576).
- the reference slice image of slice image C is slice image B, so the to-be-spliced image C″ corresponding to slice image C is the area other than the processed overlapping area in the processed slice image C'.
- in the processed slice image C', the processed overlapping area is the area corresponding to the overlapping area with the processed reference slice image B', and the image to be spliced C″ is the area defined by the thick black frame in FIG. 6C
- the size of the image to be spliced C″ is 540×576 ((576−36)×576).
- the reference slice image of slice image D is slice image A
- the image to be spliced D″ corresponding to slice image D is the area other than the processed overlapping area in the processed slice image D'.
- in the processed slice image D', the processed overlapping area is the area corresponding to the overlapping area with the processed reference slice image A'
- the image to be spliced D″ is the area defined by the thick black frame in FIG. 6D
- the size of the image to be spliced D″ is 576×544 (576×(576−32)).
- the images to be spliced E″ through I″ are sequentially determined according to the above method, and the specific process will not be repeated.
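Extracting an image to be spliced by dropping the processed overlapping area can be sketched as follows; `trim_left` and `trim_top` are assumed parameters giving the overlap width on the side(s) shared with the reference slice (0 for the reference slice itself):

```python
def to_be_spliced(processed_slice, trim_left=0, trim_top=0):
    """Drop the rows/columns that duplicate the reference slice's processed
    overlap: e.g. trim_left=32 turns a 576x576 slice into a 544x576 one."""
    return [row[trim_left:] for row in processed_slice[trim_top:]]
```

Stitching the trimmed images edge to edge then reproduces the composite image without double-counting the overlap pixels, since each overlap survives only in the reference-side image.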
- a composite image is obtained by stitching the plurality of images to be stitched.
- FIG. 6E is a schematic diagram of a composite image provided by an embodiment of the present disclosure.
- A″-I″ in FIG. 6E respectively represent the images to be spliced A″ through I″ corresponding to slice area A through slice area I shown in FIG. 3E
- the determination method of each image to be spliced is as described above, and the sizes of the images to be spliced are not all the same.
- FIG. 6F is a schematic diagram of a composite image provided by an embodiment of the present disclosure.
- FIG. 6F is a composite image obtained by performing the image processing method provided by at least one embodiment of the present disclosure on the initial image shown in FIG. 2 .
- the composite image is a color image.
- FIG. 7A is a schematic diagram of an initial image provided by an embodiment of the disclosure
- FIG. 7B is a schematic diagram of a processed image provided by an embodiment of the disclosure
- FIG. 7C is a schematic diagram of a slice image provided by an embodiment of the disclosure.
- in some application scenarios, edge cleaning of the image to be processed is required to remove content at the edges of the image that is not part of the image content.
- in FIG. 7A, the areas along the upper, lower, and left sides of the initial image include black borders that are not part of the image content, so edge cleaning needs to be performed on the initial image to remove these black borders, thereby obtaining the processed image shown in FIG. 7B, which contains only the image content without the black borders of the initial image.
- however, the edges of the slice images obtained by slicing the initial image may be mistakenly cleaned during edge cleaning.
- for example, in FIG. 7C, the area bounded by the black dashed box is likely to be removed during edge cleaning as content that does not belong to the image.
- to avoid this, the original image to be processed may be marked with a border to obtain the initial image, so that only edges carrying the mark are cleaned and edges without the mark are left untouched, preventing the edges of slice images from being mistakenly cleaned.
- step S10 may include: acquiring an original image; and adding a preset border along the edge of the original image on the side away from the center of the original image to obtain the initial image, wherein the initial image includes the original image and the preset border, and the color of the preset border is a preset color.
- the method further includes: performing edge removal processing on the plurality of slice images.
- adding a preset border along the edge of the original image on the side away from the center of the original image to obtain the initial image may include: determining the edge of the original image; generating a preset border based on the edge of the original image, wherein the preset border includes a first edge and a second edge; and setting the preset border on the side of the edge of the original image away from the center of the original image to obtain the initial image, wherein the first edge of the preset border overlaps the edge of the original image, and the second edge of the preset border is spaced from the edge of the original image by a second pixel width.
- alternatively, adding the preset border may include: extending the edge of the original image by a second pixel width toward the side away from the center of the original image to obtain the preset border, taking the edge of the expanded image as the second edge of the preset border, taking the edge of the original image before expansion as the first edge of the preset border, and filling the preset border with the preset color, thereby obtaining the initial image.
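Adding the preset border around the original image can be sketched as below; red as the preset color and 32 pixels as the second pixel width are purely illustrative assumptions:

```python
RED = (255, 0, 0)   # assumed preset color

def add_preset_border(original, border=32, color=RED):
    """Pad the original image (2D list of RGB tuples) on all four sides with
    a `border`-pixel frame of the preset color. The frame's inner boundary is
    the first edge; its outer boundary is the second edge."""
    w = len(original[0])
    blank_row = [color] * (w + 2 * border)
    framed = [list(blank_row) for _ in range(border)]        # top of frame
    for row in original:
        framed.append([color] * border + list(row) + [color] * border)
    framed.extend(list(blank_row) for _ in range(border))    # bottom of frame
    return framed
```

After padding, the initial image is `2*border` pixels larger in each dimension, and every outermost pixel has the preset color, which is what the later edge-detection step relies on.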
- when the slicing process described in step S20 is performed, slicing is based on the initial image carrying the preset border, so that any slice image overlapping the edge of the original image still retains part of the preset border; edge cleaning can then be performed on the slice images according to the preset border.
- each slice image includes four slice edges
- performing edge removal processing on the plurality of slice images may include: in response to detecting that the ith slice edge of a slice image is part of the second edge, performing edge removal processing on that slice edge; and in response to detecting that the ith slice edge is not part of the second edge, not performing edge removal processing on it, where i is a positive integer less than or equal to 4.
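The per-edge cleaning decision can be sketched as follows: a slice edge is treated as part of the second edge when every pixel on it has the preset color. The detection rule and names are illustrative assumptions; the disclosure's detection is not limited to this:

```python
PRESET = (255, 0, 0)   # assumed preset border color

def edges_to_clean(slice_img, preset=PRESET):
    """Return, for each of the four slice edges, whether it is part of the
    second edge (entirely preset-colored) and should therefore be cleaned."""
    top, bottom = slice_img[0], slice_img[-1]
    left = [row[0] for row in slice_img]
    right = [row[-1] for row in slice_img]
    return {name: all(p == preset for p in edge)
            for name, edge in (("top", top), ("bottom", bottom),
                               ("left", left), ("right", right))}
```

An interior slice edge (like the lower edge of slice image (1) in FIG. 7E) contains ordinary image content, fails the all-preset test, and so is left untouched, which is exactly the behavior described above.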
- for example, the edge-cleaning model may be trained on training images carrying the preset border: when an edge of an image carries the preset border, that edge is cleaned; when an edge does not carry the preset border, it is not cleaned, so as to avoid loss of non-edge details.
- the method of edge cleaning can be performed in any feasible manner, and the present disclosure does not limit the process of edge cleaning.
- the edge region of the composite image can be determined first, and the edge region can be traversed by performing row scanning and column scanning on the initial image to determine whether there is at least one area to be cleaned whose size exceeds a preset threshold; the pixel values of the pixels corresponding to the at least one area to be cleaned are then set to a preset pixel value.
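One hypothetical way to realize the row-scanning part of this step (column scanning is symmetric, as are the other three edge bands) is sketched below; the run-length criterion, band width, and all names are assumptions:

```python
import numpy as np

def clean_edge_rows(image, border_color, fill_value, min_run, band=32):
    """Scan each row of the top `band` rows of `image`; any horizontal run
    of pixels equal to `border_color` longer than `min_run` is treated as
    an area to be cleaned and overwritten with `fill_value`."""
    out = image.copy()
    rows = min(band, out.shape[0])
    mask = np.all(out[:rows] == border_color, axis=-1)
    for r in range(rows):
        run_start = None
        for col in range(out.shape[1] + 1):
            hit = col < out.shape[1] and mask[r, col]
            if hit and run_start is None:
                run_start = col                # a run of border pixels begins
            elif not hit and run_start is not None:
                if col - run_start > min_run:  # run exceeds the threshold
                    out[r, run_start:col] = fill_value
                run_start = None
    return out
```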
- FIG. 7D is a schematic diagram of an initial image provided by an embodiment of the present disclosure.
- the dotted frame is the edge of the original image, and a preset border with a preset color and a preset width is added based on the edge of the original image.
- the preset color can be red
- the preset width can be set as required, for example, any value within the range of 20 to 50 pixels; the first edge of the preset border overlaps the edge of the original image, that is, the dotted box in the figure, and the second edge of the preset border is the dot-dash box in FIG. 7D, with the second edge separated from the edge of the original image by a second pixel width.
- FIG. 7E shows a schematic diagram of a slice image corresponding to the initial image shown in FIG. 7D .
- the initial image shown in FIG. 7D is divided into slice image (1) and slice image (2) with the horizontal central axis as the boundary, and the dashed boxes show the slice edges of slice image (1) and slice image (2).
- the slice edges of slice image (1) other than its lower slice edge are all part of the second edge, so during image processing edge removal is not performed on the lower slice edge of slice image (1),
- while the upper, left, and right slice edges of slice image (1) are edge-cleaned; the slice edges of slice image (2) other than its upper slice edge are all part of the second edge, so during image processing edge removal is not performed on the upper slice edge of slice image (2), while the lower, left, and right slice edges of slice image (2) are edge-cleaned.
- FIG. 7F shows a schematic diagram of a processed slice image corresponding to the slice image shown in FIG. 7E .
- slice image (1) and slice image (2) are subjected to tone processing, thereby obtaining the processed slice image (1) and the processed slice image (2) as shown in FIG. 7F.
- the processed slice image (1) corresponds to the slice image (1)
- the processed slice image (2) corresponds to slice image (2), and the image content at the upper edge of slice image (2) is preserved after processing, rather than being cleaned away as if it were non-content edge residue.
- a border mark can be added to the original image, for example, a preset border is used to indicate the edge of the original image, so that edge cleaning is applied only to slice edges that carry the preset border, avoiding loss of non-edge details.
- FIGS. 8A to 8C show schematic diagrams of an image processing process for executing the image processing method provided by an embodiment of the present disclosure.
- the specific execution process of the image processing method provided by at least one embodiment of the present disclosure will be described in detail below with reference to FIGS. 8A to 8C .
- FIG. 8A is a schematic diagram of an initial image provided by an embodiment of the present disclosure.
- the initial image shown in FIG. 8A is obtained based on the original image shown in FIG. 2 .
- the dotted frame is the edge of the original image.
- a preset border with a preset color and a preset width is added.
- the first edge of the preset border overlaps the edge of the original image, that is, the dotted box in the figure; the second edge of the preset border is the dot-dash box in FIG. 8A, and the second edge is separated from the edge of the original image by a second pixel width.
- the slicing processing in step S20 may be performed on the initial image shown in FIG. 8A to obtain a plurality of slice images, wherein the slice images are determined based on the initial image including the preset border and the size of each slice image is the slice size; the specific process is as described above and will not be repeated here.
- edge cleaning is performed on the sliced image. The process of edge cleaning is as described above, and will not be repeated here.
- tones are processed on the sliced image after edge cleaning, and the specific process is as described in step S30, which will not be repeated here.
- FIG. 8B shows a schematic diagram of the slice image to be processed corresponding to the initial image shown in FIG. 8A .
- the slice images to be processed in the figure are three to-be-processed slice images respectively corresponding to three of the plurality of slice images, for example, to-be-processed slice image (1), to-be-processed slice image (2), and to-be-processed slice image (3), wherein to-be-processed slice image (2) and to-be-processed slice image (3) each include a reference area defined by the dotted box; the method for determining the reference area and the tone processing process are as described above and will not be repeated here.
- FIG. 8C shows a schematic diagram of a processed slice image corresponding to the slice image to be processed shown in FIG. 8B .
- the above-mentioned edge removal processing and tone processing are performed on the to-be-processed slice images shown in FIG. 8B to obtain processed slice images (1) to (3) as shown in FIG. 8C (the dashed boxes in FIG. 8C).
- the processed slice image (1) corresponds to the to-be-processed slice image (1)
- the processed slice image (2) corresponds to the to-be-processed slice image (2)
- the processed slice image (3) corresponds to the to-be-processed slice image (3).
- the edge images in the processed slice image (1), the processed slice image (2), and the processed slice image (3) are preserved.
- the stitching processing described in step S40 is then performed, which is not repeated here.
- FIG. 6F shows a schematic diagram of the processed image corresponding to the original image shown in FIG. 2 .
- a frame removal process may be performed on the composite image to remove the preset frame, thereby obtaining a processed image corresponding to the original image shown in FIG. 2 as shown in FIG. 6F .
- FIG. 9 is a schematic block diagram of an image processing apparatus provided by at least one embodiment of the present disclosure.
- the image processing apparatus 900 may include: an acquisition unit 901 , a slice processing unit 902 , a tone processing unit 903 and a synthesis unit 904 .
- these modules may be implemented by hardware (eg, circuit) modules, software modules, or any combination of the two, and the following embodiments are the same, and will not be described again.
- it may be implemented by a central processing unit (CPU), a graphics processing unit (GPU), a tensor processing unit (TPU), a field-programmable gate array (FPGA), or another processing unit having data processing capability and/or instruction execution capability.
- the acquisition unit 901 is configured to acquire an initial image.
- the slice processing unit 902 is configured to perform slice processing on the initial image to obtain a plurality of slice images corresponding to the initial image.
- the tone processing unit 903 is configured to perform tone processing on the plurality of slice images to obtain a plurality of processed slice images, wherein the tones of the plurality of processed slice images are kept consistent.
- the synthesizing unit 904 is configured to perform stitching processing on the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image, so as to obtain a composite image.
- the acquisition unit 901, the slice processing unit 902, the tone processing unit 903, and the synthesis unit 904 may include code and programs stored in a memory; the code and programs may be executed by a processor to implement some or all of the functions of the acquisition unit 901, the slice processing unit 902, the tone processing unit 903, and the synthesis unit 904 as described above.
- the acquisition unit 901, the slice processing unit 902, the tone processing unit 903, and the synthesis unit 904 may be dedicated hardware devices for realizing some or all of the functions of these units as described above.
- the acquiring unit 901, the slicing processing unit 902, the tone processing unit 903, and the synthesizing unit 904 may be one circuit board or a combination of a plurality of circuit boards for implementing the functions as described above.
- the one circuit board or the combination of multiple circuit boards may include: (1) one or more processors; (2) one or more non-transitory memories connected to the processors; and (3) firmware stored in the memories and executable by the processors.
- the acquiring unit 901 can be used to implement the step S10 shown in FIG. 1
- the slice processing unit 902 can be used to implement the step S20 shown in FIG. 1
- the tone processing unit 903 can be used to implement step S30 shown in FIG. 1
- the synthesis unit 904 may be used to implement step S40 shown in FIG. 1. Therefore, for a specific description of the functions that the acquisition unit 901, the slice processing unit 902, the tone processing unit 903, and the synthesis unit 904 can implement, reference may be made to the descriptions of steps S10 to S40 in the embodiments of the image processing method above, and repetitions are not described again.
- the image processing apparatus 900 can achieve technical effects similar to those of the aforementioned image processing methods, and details are not described herein again.
- the image processing apparatus 900 may include more or fewer circuits or units, and the connection relationships between the circuits or units are not limited and may be determined according to actual needs.
- the specific structure of each circuit or unit is not limited, and can be composed of analog devices, digital chips, or other suitable ways according to circuit principles.
- FIG. 10 is a schematic diagram of an electronic device provided by at least one embodiment of the present disclosure.
- the electronic device includes a processor 1001 , a communication interface 1002 , a memory 1003 and a communication bus 1004 .
- the processor 1001, the communication interface 1002, and the memory 1003 communicate with each other through the communication bus 1004, and the components such as the processor 1001, the communication interface 1002, and the memory 1003 can also communicate through a network connection.
- the present disclosure does not limit the type and function of the network. It should be noted that the components of the electronic device shown in FIG. 10 are only exemplary and not restrictive, and the electronic device may also have other components according to actual application requirements.
- memory 1003 is used for non-transitory storage of computer readable instructions.
- the processor 1001 is configured to execute the computer-readable instructions; when the computer-readable instructions are executed by the processor 1001, the image processing method according to any one of the foregoing embodiments is implemented.
- the communication bus 1004 may be a Peripheral Component Interconnect Standard (PCI) bus or an Extended Industry Standard Architecture (EISA) bus, or the like.
- the communication bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of presentation, only one thick line is used in the figure, but it does not mean that there is only one bus or one type of bus.
- the communication interface 1002 is used to enable communication between the electronic device and other devices.
- the processor 1001 and the memory 1003 may be provided on the server side (or the cloud).
- the processor 1001 may control other components in the electronic device to perform desired functions.
- the processor 1001 may be a central processing unit (CPU), a network processing unit (NP), a tensor processing unit (TPU), a graphics processing unit (GPU), or other devices with data processing capabilities and/or program execution capabilities; it may also be Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components.
- the central processing unit (CPU) can be an X86 or an ARM architecture or the like.
- Memory 1003 may include any combination of one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory.
- Volatile memory may include, for example, random access memory (RAM) and/or cache memory, among others.
- Non-volatile memory may include, for example, read only memory (ROM), hard disk, erasable programmable read only memory (EPROM), portable compact disk read only memory (CD-ROM), USB memory, flash memory, and the like.
- the electronic device may also include an image capture component.
- the image acquisition component is used to acquire images.
- the memory 1003 is also used to store acquired images.
- the image acquisition component may be a smartphone camera, a tablet camera, a personal computer camera, a digital camera lens, or even a web camera.
- FIG. 11 is a schematic diagram of a non-transitory computer-readable storage medium provided by at least one embodiment of the present disclosure.
- the storage medium 1100 may be a non-transitory computer-readable storage medium on which one or more computer-readable instructions 1101 may be stored non-transitory.
- the computer-readable instructions 1101, when executed by a processor, may perform one or more steps of the image processing method described above.
- the storage medium 1100 may be applied to the above-mentioned electronic device, for example, the storage medium 1100 may include a memory in the electronic device.
- the storage medium may include a memory card of a smartphone, a storage component of a tablet computer, a hard disk of a personal computer, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM), A portable compact disk read only memory (CD-ROM), flash memory, or any combination of the above storage media, may also be other suitable storage media.
- FIG. 12 shows a schematic diagram of a hardware environment provided by at least one embodiment of the present disclosure.
- the electronic device provided by the present disclosure can be applied to the Internet system.
- the functions of the image processing apparatus and/or electronic device involved in the present disclosure can be realized by using the computer system provided in FIG. 12 .
- Such computer systems may include personal computers, laptops, tablets, cell phones, personal digital assistants, smart glasses, smart watches, smart rings, smart helmets, and any smart portable or wearable device.
- the specific system in this embodiment uses functional block diagrams to illustrate a hardware platform including a user interface.
- such a computer device may be a general-purpose computer device or a special-purpose computer device; both can be used to implement the image processing apparatus and/or electronic device in this embodiment.
- the computer system may include any component used to realize the information required for the image processing described herein.
- a computer system can be implemented by a computer device through its hardware devices, software programs, firmware, and combinations thereof.
- only one computer device is drawn in FIG. 12, but the computer functions described in this embodiment for realizing the information required for image processing can be implemented in a distributed manner by a group of similar platforms, distributing the processing load of the computer system.
- the computer system can include a communication port 250, which is connected to a network for realizing data communication.
- the computer system can send and receive information and data through the communication port 250; that is, the communication port 250 enables the computer system to communicate with other electronic devices wirelessly or by wire to exchange data.
- the computer system may also include a processor group 220 (ie, the processors described above) for executing program instructions.
- the processor group 220 may consist of at least one processor (eg, a CPU).
- the computer system may include an internal communication bus 210 .
- the computer system may include various forms of program storage units and data storage units (i.e., the memories or storage media described above), such as a hard disk 270, a read-only memory (ROM) 230, and a random access memory (RAM) 240, capable of storing various data files used for computer processing and/or communication, as well as program instructions executed by the processor group 220.
- the computer system may also include an input/output component 260 for enabling input/output data flow between the computer system and other components (eg, user interface 280, etc.).
- the input/output component 260 supports input devices such as a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, or gyroscope; output devices such as a liquid crystal display (LCD), speakers, or vibrators; storage devices such as tapes or hard disks; and a communication interface.
- although FIG. 12 shows a computer system with various devices, it should be understood that the computer system is not required to have all of the devices shown and may instead have more or fewer devices.
Abstract
An image processing method, an image processing apparatus, an electronic device, and a storage medium. The image processing method includes: acquiring an initial image; performing slicing processing on the initial image to obtain a plurality of slice images corresponding to the initial image; performing tone processing on the plurality of slice images to obtain a plurality of processed slice images, wherein the tones of the plurality of processed slice images are kept consistent; and stitching the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image to obtain a composite image. The image processing method performs tone processing on the plurality of slice images separately so that the tones of the plurality of processed slice images remain consistent, thereby ensuring that the final composite image has a uniform tone.
Description
Embodiments of the present disclosure relate to an image processing method, an image processing apparatus, an electronic device, and a non-transitory computer-readable storage medium.

In digital image processing, an acquired digital image can be restored, through color restoration processing, into an image with true colors. For example, the digital image may be a color image, and the image after color restoration processing has true colors; for another example, the digital image may also be a black-and-white image, and the image after color restoration processing is an image with true colors corresponding to the black-and-white image.

Summary
At least one embodiment of the present disclosure provides an image processing method, including: acquiring an initial image; performing slicing processing on the initial image to obtain a plurality of slice images corresponding to the initial image; performing tone processing on the plurality of slice images to obtain a plurality of processed slice images, wherein the tones of the plurality of processed slice images are kept consistent; and stitching the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image to obtain a composite image.

Preferably, in the image processing method provided by at least one embodiment of the present disclosure, performing tone processing on the plurality of slice images to obtain the plurality of processed slice images includes: determining a processing order; determining, based on the processing order, a base slice image among the plurality of slice images, wherein the base slice image is the first slice image on which tone processing is performed according to the processing order; performing tone processing on the base slice image to obtain a processed base slice image corresponding to the base slice image; taking the tone of the processed base slice image as a base tone; and performing, based on the base tone, tone processing on all slice images other than the base slice image among the plurality of slice images to obtain processed slice images respectively corresponding to all the slice images, wherein the tones of the processed slice images respectively corresponding to all the slice images are kept consistent with the base tone, and the plurality of processed slice images include the processed base slice image and the processed slice images respectively corresponding to all the slice images.

Preferably, in the image processing method provided by at least one embodiment of the present disclosure, performing, based on the base tone, tone processing on all slice images other than the base slice image among the plurality of slice images to obtain the processed slice images respectively corresponding to all the slice images includes: for an i-th slice image among all the slice images: determining a reference area corresponding to the i-th slice image, wherein the reference area is used to provide a tone reference when tone processing is performed on the i-th slice image, and the tone of the reference area is kept consistent with the base tone; and performing, based on the reference area, tone processing on the i-th slice image to obtain a processed slice image corresponding to the i-th slice image, wherein i is a positive integer.

Preferably, in the image processing method provided by at least one embodiment of the present disclosure, determining the reference area corresponding to the i-th slice image includes: determining a reference slice image corresponding to the i-th slice image among the plurality of slice images, wherein the area of the reference slice image that overlaps the i-th slice image is an overlap area, and tone processing has already been performed on the reference slice image before the i-th slice image undergoes tone processing; acquiring a processed reference slice image corresponding to the reference slice image; determining a processed overlap area based on the overlap area and the processed reference slice image, wherein the processed overlap area is the area of the processed reference slice image corresponding to the overlap area; and determining at least part of the processed overlap area as the reference area.

Preferably, in the image processing method provided by at least one embodiment of the present disclosure, determining at least part of the processed overlap area as the reference area includes: in response to the width of the processed overlap area in an extension direction being equal to a first pixel width, taking the processed overlap area as the reference area; and in response to the width of the processed overlap area in the extension direction being greater than the first pixel width, selecting a partial area of the processed overlap area as the reference area, wherein the width of the partial area in the arrangement direction is the first pixel width; wherein the extension direction is the direction of the line connecting the center of the processed overlap area and the center of the area of the processed reference slice image other than the processed overlap area.

Preferably, in the image processing method provided by at least one embodiment of the present disclosure, performing, based on the reference area, tone processing on the i-th slice image to obtain the processed slice image corresponding to the i-th slice image includes: determining the area of the i-th slice image corresponding to the reference area as a first area; determining the area of the i-th slice image other than the first area as a second area; stitching the second area and the reference area to obtain a to-be-processed slice image corresponding to the i-th slice image, wherein the size of the to-be-processed slice image is the same as the size of the i-th slice image; and performing tone processing on the to-be-processed slice image to obtain the processed slice image corresponding to the i-th slice image.

Preferably, in the image processing method provided by at least one embodiment of the present disclosure, performing tone processing on the to-be-processed slice image to obtain the processed slice image corresponding to the i-th slice image includes: performing, based on the reference area in the to-be-processed slice image, tone processing on the second area in the to-be-processed slice image to obtain a processed area corresponding to the second area, wherein the reference area is used to provide a tone reference for the tone processing of the second area, and the tone of the processed area is kept consistent with the tone of the reference area; and obtaining, based on the reference area and the processed area, the processed slice image corresponding to the i-th slice image.

Preferably, in the image processing method provided by at least one embodiment of the present disclosure, performing slicing processing on the initial image to obtain the plurality of slice images corresponding to the initial image includes: determining a slice size; dividing the initial image into a plurality of slice areas according to the slice size, wherein each slice area at least partially overlaps every slice area adjacent to it; and performing slicing processing on the initial image according to the plurality of slice areas to obtain the plurality of slice images, wherein the plurality of slice images correspond one-to-one to the plurality of slice areas, and each slice image includes one of the plurality of slice areas.

Preferably, in the image processing method provided by at least one embodiment of the present disclosure, all pixels in the composite image are arranged in n rows and m columns, and stitching the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image to obtain the composite image includes: stitching the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image to obtain an intermediate composite image, wherein all pixels in the intermediate composite image are arranged in n rows and m columns; in the case where row t1, column t2 of the intermediate composite image includes only one pixel, taking the pixel value of that one pixel as the pixel value of the pixel at row t1, column t2 of the composite image; and in the case where row t1, column t2 of the intermediate composite image includes a plurality of pixels, selecting the pixel value of any one of the plurality of pixels as the pixel value of the pixel at row t1, column t2 of the composite image, wherein n, m, t1, and t2 are all positive integers, t1 is less than or equal to n, and t2 is less than or equal to m.
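The pixel-selection rule above can be illustrated with a minimal sketch (all names are assumptions; "select any one pixel" is realized here by simply letting the last-written slice win in overlap cells):

```python
import numpy as np

def stitch(processed_slices, positions, out_shape):
    """Stitch processed slice images back at their positions in the
    initial image. Cells covered by a single slice take that pixel's
    value; cells covered by several overlapping slices keep the value of
    the last slice written, i.e. "any one" of the candidate pixels."""
    composite = np.zeros(out_shape, dtype=processed_slices[0].dtype)
    for img, (top, left) in zip(processed_slices, positions):
        h, w = img.shape[:2]
        composite[top:top + h, left:left + w] = img
    return composite
```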
Preferably, in the image processing method provided by at least one embodiment of the present disclosure, acquiring the initial image includes: acquiring an original image; and adding a preset border along the edge of the original image to the side away from the center of the original image to obtain the initial image, wherein the initial image includes the original image and the preset border, and the color of the preset border is a preset color; and before tone processing is performed on the plurality of slice images, the method further includes: performing edge removal processing on the plurality of slice images.

Preferably, in the image processing method provided by at least one embodiment of the present disclosure, adding a preset border along the edge of the original image to the side away from the center of the original image to obtain the initial image includes: determining the edge of the original image; generating the preset border based on the edge of the original image, wherein the preset border includes a first edge and a second edge; and setting the preset border on the side of the edge of the original image away from the center of the original image to obtain the initial image, wherein the first edge of the preset border overlaps the edge of the original image, the second edge of the preset border is separated from the edge of the original image by a second pixel width, and the edge of the initial image is the second edge.

Preferably, in the image processing method provided by at least one embodiment of the present disclosure, each slice image includes four slice edges, and performing edge removal processing on the plurality of slice images includes: in response to detecting that the i-th slice edge of each slice image is part of the second edge, performing edge removal processing on the i-th slice edge; and in response to detecting that the i-th slice edge of each slice image is not part of the second edge, not performing edge removal processing on the i-th slice edge, wherein i is a positive integer less than or equal to 4.

Preferably, in the image processing method provided by at least one embodiment of the present disclosure, the composite image is a color image.

At least one embodiment of the present disclosure provides an image processing apparatus, including: an acquisition unit configured to acquire an initial image; a slice processing unit configured to perform slicing processing on the initial image to obtain a plurality of slice images corresponding to the initial image; a tone processing unit configured to perform tone processing on the plurality of slice images to obtain a plurality of processed slice images, wherein the tones of the plurality of processed slice images are kept consistent; and a synthesizing unit configured to stitch the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image to obtain a composite image.

At least one embodiment of the present disclosure provides an electronic device, including: a memory non-transitorily storing computer-executable instructions; and a processor configured to run the computer-executable instructions, wherein the computer-executable instructions, when run by the processor, implement the image processing method according to any embodiment of the present disclosure.

At least one embodiment of the present disclosure provides a non-transitory computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the image processing method according to any embodiment of the present disclosure.
To explain the technical solutions of the embodiments of the present disclosure more clearly, the accompanying drawings of the embodiments are briefly introduced below. Obviously, the drawings described below relate only to some embodiments of the present disclosure and do not limit the present disclosure.
FIG. 1 is a schematic flowchart of an image processing method provided by at least one embodiment of the present disclosure;

FIG. 2 is a schematic diagram of an initial image provided by at least one embodiment of the present disclosure;

FIGS. 3A to 3E are schematic diagrams of an image slicing process provided by at least one embodiment of the present disclosure;

FIG. 4A is a schematic flowchart of step S30 of the image processing method shown in FIG. 1;

FIG. 4B is a schematic flowchart of step S305 of the image processing method shown in FIG. 4A;

FIG. 4C is a schematic diagram of slice image A and slice image B provided by an embodiment of the present disclosure;

FIG. 4D is a schematic diagram of slice image B and slice image C provided by an embodiment of the present disclosure;

FIG. 4E is a schematic diagram of a to-be-processed slice image provided by an embodiment of the present disclosure;

FIGS. 5A to 5D are schematic diagrams of to-be-processed slice images and corresponding processed slice images provided by at least one embodiment of the present disclosure;

FIG. 6A is a schematic diagram of an intermediate composite image provided by at least one embodiment of the present disclosure;

FIGS. 6B to 6E are schematic diagrams of the processing of a composite image provided by at least one embodiment of the present disclosure;

FIG. 6F is a schematic diagram of a composite image provided by an embodiment of the present disclosure;

FIG. 7A is a schematic diagram of an initial image provided by an embodiment of the present disclosure;

FIG. 7B is a schematic diagram of a processed image provided by an embodiment of the present disclosure;

FIG. 7C is a schematic diagram of a slice image provided by an embodiment of the present disclosure;

FIGS. 7D to 7F are schematic diagrams of an image processing flow including edge removal processing provided by an embodiment of the present disclosure;

FIGS. 8A to 8C are schematic diagrams of an image processing flow provided by an embodiment of the present disclosure;

FIG. 9 is a schematic block diagram of an image processing apparatus provided by at least one embodiment of the present disclosure;

FIG. 10 is a schematic diagram of an electronic device provided by at least one embodiment of the present disclosure;

FIG. 11 is a schematic diagram of a non-transitory computer-readable storage medium provided by at least one embodiment of the present disclosure;

FIG. 12 is a schematic diagram of a hardware environment provided by at least one embodiment of the present disclosure.
To make the objectives, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions of the embodiments of the present disclosure are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. Based on the described embodiments of the present disclosure, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the scope of protection of the present disclosure.

Unless otherwise defined, technical or scientific terms used in the present disclosure shall have the ordinary meaning understood by a person with ordinary skill in the field to which the present disclosure belongs. "First", "second", and similar words used in the present disclosure do not denote any order, quantity, or importance, but are only used to distinguish different components. "Include", "comprise", and similar words mean that the element or item preceding the word covers the elements or items listed after the word and their equivalents, without excluding other elements or items. "Connect", "connected", and similar words are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "Up", "down", "left", "right", and the like are only used to indicate relative positional relationships; when the absolute position of the described object changes, the relative positional relationship may change accordingly. To keep the following description of the embodiments of the present disclosure clear and concise, detailed descriptions of some known functions and known components are omitted.

In the process of restoring or processing an acquired digital image into a color image with true colors, a color restoration model may be used for the processing, for example, processing a color image into an image with true colors, or processing a black-and-white image into a color image. Since the color restoration model occupies a relatively large amount of memory, the color restoration processing rate is reduced; or, in some examples, the image is processed by a mobile terminal, in which case the memory occupied by the color restoration model may exceed the memory of the mobile terminal, making image processing impossible. Image slicing is usually adopted: the original image is divided into a plurality of slice images, and color restoration is performed on each slice image separately, so as to reduce the memory resources occupied during color restoration. In this scenario, however, since the color restoration model cannot obtain the global information of the digital image, the tones of the slice images after color restoration are inconsistent.
At least one embodiment of the present disclosure provides an image processing method, an image processing apparatus, an electronic device, and a non-transitory computer-readable storage medium. The image processing method includes: acquiring an initial image; performing slicing processing on the initial image to obtain a plurality of slice images corresponding to the initial image; performing tone processing on the plurality of slice images to obtain a plurality of processed slice images, wherein the tones of the plurality of processed slice images are kept consistent; and stitching the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image to obtain a composite image.

The image processing method performs tone processing on the plurality of slice images separately so that the tones of the plurality of processed slice images remain consistent, thereby ensuring that the final composite image has a uniform tone.

The image processing method provided by the embodiments of the present disclosure can be applied to a mobile terminal (e.g., a mobile phone or a tablet computer); on the basis of improving the processing speed, it keeps the tones of the plurality of processed images uniform, and can also perform real-time color restoration on images captured by the mobile terminal.

It should be noted that the image processing method provided by the embodiments of the present disclosure can be applied to the image processing apparatus provided by the embodiments of the present disclosure, and the image processing apparatus can be configured on an electronic device. The electronic device may be a personal computer, a mobile terminal, or the like, and the mobile terminal may be a hardware device with various operating systems, such as a mobile phone or a tablet computer.

The embodiments of the present disclosure are described in detail below with reference to the accompanying drawings, but the present disclosure is not limited to these specific embodiments.
FIG. 1 is a schematic flowchart of an image processing method provided by at least one embodiment of the present disclosure. FIG. 2 is a schematic diagram of an initial image provided by at least one embodiment of the present disclosure.

As shown in FIG. 1, the image processing method provided by at least one embodiment of the present disclosure includes steps S10 to S40.

Step S10: acquire an initial image.

Step S20: perform slicing processing on the initial image to obtain a plurality of slice images corresponding to the initial image.

Step S30: perform tone processing on the plurality of slice images to obtain a plurality of processed slice images, wherein the tones of the plurality of processed slice images are kept consistent.

Step S40: stitch the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image to obtain a composite image.
For example, the initial image acquired in step S10 may include at least one object, and the object may be a character; characters may include Chinese (e.g., Chinese characters or pinyin), English, Japanese, French, Korean, Latin, digits, etc. In addition, objects may also include various symbols (e.g., greater-than signs, less-than signs, percent signs), and various figures and images (hand-drawn or printed). The at least one object may include printed or machine-input characters, and may also include handwritten characters. For example, as shown in FIG. 2, in some embodiments, the objects in the initial image may include printed words and letters (e.g., in languages such as English, Japanese, French, Korean, German, or Latin), printed digits (e.g., dates, weights, sizes), printed symbols and images, handwritten words and letters, handwritten digits, and handwritten symbols and figures.

For example, the initial image may be of various types, such as an image of a shopping list, a restaurant receipt, an examination paper, a contract, or a painting. As shown in FIG. 2, the initial image may be an image of a letter.

For example, the shape of the initial image may be rectangular, etc. The shape and size of the initial image may be set by the user according to the actual situation.

For example, the initial image may be an image captured by an image acquisition device (e.g., a digital camera or a mobile phone), and may be a grayscale image or a color image. It should be noted that the initial image refers to a form in which an object is presented visually, such as a picture. For another example, the initial image may also be obtained by scanning or the like. The initial image may be an image directly captured by the image acquisition device, or an image obtained by preprocessing a captured image. For example, to avoid the influence of data quality, data imbalance, etc. of the initial image on the image processing, the image processing method provided by at least one embodiment of the present disclosure may further include an operation of preprocessing the initial image before processing it. Preprocessing may include, for example, cropping, gamma correction, or noise-reduction filtering of the image directly captured by the image acquisition device. Preprocessing can remove irrelevant or noise information from the initial image, facilitating subsequent image processing.
For example, in some embodiments, step S20 may include: determining a slice size; dividing the initial image into a plurality of slice areas according to the slice size, wherein each slice area at least partially overlaps every slice area adjacent to it; and performing slicing processing on the initial image according to the plurality of slice areas to obtain the plurality of slice images, wherein the plurality of slice images correspond one-to-one to the plurality of slice areas, and each slice image includes one of the plurality of slice areas.

For example, the slice size is the actual slice size, i.e., the size of a slice image obtained after the initial image is sliced. For example, the width and height corresponding to the slice size may be equal or unequal. For example, in at least one embodiment of the present disclosure, the slice size may be 576×576; the present disclosure is not limited thereto, and the slice size may be set according to the actual situation.

It should be noted that, in the embodiments of the present disclosure, the unit of size is pixels; that is, a slice size of 576×576 means 576 pixels × 576 pixels.

For example, "adjacent" here may include adjacent on the upper, left, right, and lower sides; that is, the slice areas adjacent to a given slice area include the slice area adjacent on its upper side (i.e., located above it), the slice area adjacent on its left side (i.e., located to its left), the slice area adjacent on its right side (i.e., located to its right), and the slice area adjacent on its lower side (i.e., located below it).

For example, the overlapping portion between two adjacent slice areas has at least a first pixel width in a first direction, where the first direction is the direction of the line connecting the centers of the two adjacent slice areas. For example, the first pixel width may be set as required, e.g., any value in the range of 20 to 50 pixels, such as 32 pixels.

For example, starting from any vertex of the initial image and the two edges where that vertex is located, the plurality of slice areas may be determined in sequence based on the slice size.
FIGS. 3A to 3E are schematic diagrams of an image slicing process provided by at least one embodiment of the present disclosure. As shown in FIGS. 3A to 3E, in some embodiments the size of the initial image is 1660×1660, the slice size is 576×576, and the first pixel width is 32 pixels.

For example, as shown in FIG. 3A, slice area A is first determined based on the slice size: slice area A takes the upper-left corner of the initial image as one of its vertices and the left and upper edges of the initial image as boundaries, and is determined based on the slice size.

Then, slice area B is determined based on slice area A. For example, the direction of the line connecting the center of slice area A and the center of slice area B is horizontal, and this line is parallel to the upper edge of the initial image. The upper edge of slice area B overlaps the upper edge of the initial image, and the left edge of slice area B and the right edge of slice area A form an overlap area, shown as the shaded portion P1 in FIG. 3A; the width of the overlap area P1 in the horizontal direction is 32 pixels, and the size of slice area B is also 576×576.

Then, as shown in FIG. 3B, slice area C is determined based on slice area B. Since the width of the initial image is 1660 pixels, the distance between the right edge of slice area B and the right edge of the initial image is 540 pixels (1660 − (576 + 576 − 32)). If slice area C were determined such that the right edge of slice area B and the left edge of slice area C form an overlap area 32 pixels wide in the horizontal direction, slice area C would exceed the bounds of the initial image. Therefore, slice area C can instead be determined based on the upper-right vertex of the initial image: slice area C takes the upper-right vertex of the initial image as one of its vertices and the right and upper edges of the initial image as boundaries, and is determined based on the slice size. In this case, the overlap area formed by the right edge of slice area B and the left edge of slice area C is 36 pixels wide in the horizontal direction, shown as the shaded portion P2 in FIG. 3B.

The direction of the line connecting the centers of slice areas B and C is horizontal, and this line is parallel to the upper edge of the initial image; that is, the centers of slice areas A, B, and C lie on the same straight line.

Then, as shown in FIG. 3C, slice area D is determined based on slice area A. For example, the direction of the line connecting the centers of slice areas A and D is vertical, and this line is parallel to the right edge of the initial image. The left edge of slice area D overlaps the left edge of the initial image, and the upper edge of slice area D and the lower edge of slice area A form an overlap area, shown as the shaded portion P3 in FIG. 3C; the width of the overlap area P3 in the vertical direction is 32 pixels. The size of slice area D is also 576×576.

For example, the horizontal direction and the vertical direction are perpendicular to each other.

For example, as shown in FIGS. 3A and 3C, the width of overlap area P1 in the vertical direction is 576 pixels, the width of overlap area P3 in the horizontal direction is 576 pixels, overlap areas P1 and P3 partially overlap each other, and the size of their mutual overlap is 32×32.

Then, as shown in FIG. 3D, slice area E is determined based on slice areas D and B. For example, the line connecting the centers of slice areas D and E is horizontal and parallel to the upper edge of the initial image, and the line connecting the centers of slice areas B and E is vertical and parallel to the left edge of the initial image. The left edge of slice area E and the right edge of slice area D form an overlap area, shown as the shaded portion P4 in FIG. 3D; overlap area P4 is 32 pixels wide in the horizontal direction and 576 pixels wide in the vertical direction. The upper edge of slice area E and the lower edge of slice area B form an overlap area, shown as the shaded portion P5 in FIG. 3D; overlap area P5 is 32 pixels wide in the vertical direction and 576 pixels wide in the horizontal direction. The size of slice area E is also 576×576.

By analogy, slice areas F to I are determined in sequence in the above manner; the specific process is not repeated here, and a schematic diagram of all the slice areas finally obtained is shown in FIG. 3E. As shown in FIG. 3E, the size of each slice area is 576×576, each slice area has at least an overlap area with every slice area adjacent to it, and the width of the overlap area in the extension direction is at least 32 pixels. The slice areas adjacent to slice area E include slice areas B, D, F, and H. Between slice areas E and B there is an overlap area Q1 that is 32 pixels wide in the first direction (i.e., vertical) and 576 pixels wide in the horizontal direction; between slice areas E and D there is an overlap area Q2 that is 32 pixels wide in the second direction (i.e., horizontal) and 576 pixels wide in the vertical direction; between slice areas E and F there is an overlap area Q3 that is 36 pixels wide in the second direction (i.e., horizontal) and 576 pixels wide in the vertical direction; and between slice areas E and H there is an overlap area Q4 that is 36 pixels wide in the first direction (i.e., vertical) and 576 pixels wide in the horizontal direction.

It should be noted that the order of determining the slice areas in the above example is only exemplary; the order may be set as required, and different orders may produce different overlap areas, which the present disclosure does not limit. For example, the slice areas may be determined in sequence starting from the right edge of the initial image; in that case, for the initial image in FIGS. 3A to 3E, the overlap area between slice areas B and C may be 36 pixels wide in the second direction, and the overlap area between slice areas B and A may be 32 pixels wide in the second direction.

For example, the initial image may be sliced according to the slice areas shown in FIG. 3E to obtain slice images A to I corresponding one-to-one to slice areas A to I, each slice image having a size of 576×576.
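The per-axis placement of slice regions described above (fixed slice size, minimum 32-pixel overlap, last region clamped to the image boundary) can be sketched as follows; the function name is an assumption:

```python
def slice_starts(length, slice_size=576, overlap=32):
    """Start offsets of slice regions along one axis of the initial image.
    Adjacent regions overlap by at least `overlap` pixels; the last region
    is clamped to the image boundary, which may enlarge the final overlap
    (for a 1660-pixel axis the last overlap grows from 32 to 36 pixels)."""
    step = slice_size - overlap
    starts, pos = [], 0
    while pos + slice_size < length:
        starts.append(pos)
        pos += step
    starts.append(length - slice_size)  # clamp the last region to the edge
    return starts
```

For the 1660×1660 example this yields starts [0, 544, 1084] along each axis, i.e. the 3×3 grid of slice areas A to I.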
FIG. 4A is a schematic flowchart of step S30 of the image processing method shown in FIG. 1. For example, as shown in FIG. 4A, step S30 of the image processing method may specifically include steps S301 to S305.

Step S301: determine a processing order.

Step S302: determine, based on the processing order, a base slice image among the plurality of slice images, wherein the base slice image is the first slice image on which tone processing is performed according to the processing order.

Step S303: perform tone processing on the base slice image to obtain a processed base slice image corresponding to the base slice image.

Step S304: take the tone of the processed base slice image as a base tone.

Step S305: perform, based on the base tone, tone processing on all slice images other than the base slice image among the plurality of slice images to obtain processed slice images respectively corresponding to all the slice images.

For example, in step S305, the tones of the processed slice images respectively corresponding to all the slice images are kept consistent with the base tone, and the plurality of processed slice images include the processed base slice image and the processed slice images respectively corresponding to all the slice images.

For example, the processing order in step S301 may be a predetermined order. For example, any slice image may be selected as the first slice image to undergo tone processing, i.e., the base slice image; then any slice image adjacent to the base slice image is selected for processing, and so on, to determine the processing order.

For example, for slice images A to I obtained by slicing according to the slice areas shown in FIG. 3E, the processing order may be A-B-C-D-E-F-G-H-I; in this case the base slice image is slice image A, and the tone of the processed slice image corresponding to slice image A is taken as the base tone, so that tone processing is performed on slice images B to I based on the base tone, keeping the tones of the processed slice images respectively corresponding to slice images B to I consistent with the base tone. Alternatively, the processing order may be C-B-A-D-E-F-I-H-G; in this case the base slice image is slice image C, and the tone of the processed slice image corresponding to slice image C is taken as the base tone, so that tone processing is performed on slice images A, B, and D to I based on the base tone, keeping the tones of their processed slice images consistent with the base tone.

In the image processing method provided by at least one embodiment of the present disclosure, in the process of performing tone processing on the plurality of slice images separately, the base tone is first determined, i.e., the tone of the processed base slice image is selected as the base tone, and then tone processing is performed on the other slice images based on the base tone, so that the tones of the processed slice images corresponding to the other slice images are kept consistent with the base tone. Thus, the tones of all the processed slice images are consistent, and the final composite image is an image with a uniform tone.
FIG. 4B is a schematic flowchart of step S305 of the image processing method shown in FIG. 4A. For example, in at least one embodiment of the present disclosure, as shown in FIG. 4B, step S305 may specifically include steps S3051 and S3052.

Step S3051: determine a reference area corresponding to the i-th slice image, wherein the reference area is used to provide a tone reference when tone processing is performed on the i-th slice image, and the tone of the reference area is kept consistent with the base tone.

For example, in some embodiments, step S3051 may include: determining a reference slice image corresponding to the i-th slice image among the plurality of slice images, wherein the area of the reference slice image that overlaps the i-th slice image is an overlap area, and tone processing has already been performed on the reference slice image before the i-th slice image undergoes tone processing; acquiring a processed reference slice image corresponding to the reference slice image; determining a processed overlap area based on the overlap area and the processed reference slice image, wherein the processed overlap area is the area of the processed reference slice image corresponding to the overlap area; and determining at least part of the processed overlap area as the reference area.

For example, determining at least part of the processed overlap area as the reference area may include: in response to the width of the processed overlap area in the extension direction being equal to the first pixel width, taking the processed overlap area as the reference area; and in response to the width of the processed overlap area in the extension direction being greater than the first pixel width, selecting a partial area of the processed overlap area as the reference area, wherein the width of the partial area in the arrangement direction is the first pixel width. For example, the extension direction is the direction of the line connecting the center of the processed overlap area and the center of the area of the processed reference slice image other than the processed overlap area. In this way, the reference area is designated as a reference area of fixed width, so that the processing parameters remain uniform when tone processing is performed on the slice images, improving image processing efficiency.

For example, the reference slice image must satisfy two conditions: first, the slice area corresponding to the reference slice image must at least partially overlap the slice area corresponding to the i-th slice image to be processed; second, tone processing must have already been performed on the reference slice image before the i-th slice image undergoes tone processing. For example, when the processing order is A-B-C-D-E-F-G-H-I, the base slice image is slice image A; for slice image B, the reference slice image may be slice image A; for slice image C, it may be slice image B; for slice image D, it may be slice image A; for slice image E, it may be slice image B or slice image D; for slice image F, it may be slice image E or slice image C; and so on.

FIG. 4C is a schematic diagram of slice image A and slice image B provided by an embodiment of the present disclosure. Since slice areas A and B have the overlapping shaded area P1 when the initial image is divided into slice areas (as shown in FIG. 3A), in slice image A corresponding to slice area A and slice image B corresponding to slice area B, the pixels in the portions corresponding to shaded area P1 have the same pixel values; that is, pixels at the same positions in shaded areas O1 and O2 in FIG. 4C have the same pixel values.

For example, when the i-th slice image is slice image B, the reference slice image of slice image B is slice image A, and the first pixel width is 32, the "area of the reference slice image that overlaps the i-th slice image" may be the shaded area O1 shown in FIG. 4C, and the processed overlap area may be the portion of processed slice image A' (corresponding to slice image A) that corresponds to shaded area O1. For example, since the width of the processed overlap area in the extension direction is the first pixel width of 32, the entire processed overlap area may be selected as the reference area corresponding to slice image B.

FIG. 4D is a schematic diagram of slice image B and slice image C provided by an embodiment of the present disclosure. For example, when the i-th slice image is slice image C, the reference slice image of slice image C is slice image B, and the first pixel width is 32, the "area of the reference slice image that overlaps the i-th slice image" may be the area O3 defined by the thick-line box in FIG. 4D; the processed overlap area is the portion of processed slice image B' (corresponding to slice image B) that corresponds to area O3. For example, since the width of the processed overlap area in the extension direction is greater than the first pixel width, a partial area of the processed overlap area may be selected as the reference area corresponding to slice image C. For example, the area starting from the left edge of slice image C and having the first pixel width in the extension direction (i.e., the shaded area R2 in FIG. 4D) may be selected, with its corresponding portion in the processed overlap area serving as the reference area; that is, the area starting from the left edge of the processed overlap area and having the first pixel width in the extension direction is taken as the reference area, i.e., the portion of processed slice image B' corresponding to the shaded area R1 in FIG. 4D is selected as the reference area.
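The selection rule for the reference area along the extension direction might be sketched as follows (assumed names; intervals are half-open pixel offsets, and "left" stands for the side adjacent to the slice being processed, as in the slice image C example):

```python
def reference_interval(overlap_start, overlap_end, first_px=32):
    """Pick the reference area's extent along the extension direction.
    If the processed overlap area is exactly `first_px` wide, use all of
    it; if it is wider, keep only a `first_px`-wide strip starting at the
    side adjacent to the slice being processed."""
    width = overlap_end - overlap_start
    if width <= first_px:
        return overlap_start, overlap_end
    return overlap_start, overlap_start + first_px
```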
在步骤S3052,基于参考区域,对第i个切片图像进行色调处理,以得到第i个切片图像对应的处理后切片图像,例如,i为正整数。
例如,在一些实施例中,步骤S3052可以包括:确定在第i个切片图像中与参考区域对应的区域为第一区域;确定第i个切片图像中除第一区域以外的区域为第二区域;将第二区域和参考区域进行拼合处理以得到第i个切片图像对应的待处理切片图像,其中,待处理切片图像的尺寸与第i个切片图像的尺寸相同;对待处理切片图像进行色调处理,以得到第i个切片图像对应的处理后切片图像。
For example, as shown in FIG. 4C, when the i-th slice image is slice image B and the reference region is the region of the processed slice image A' corresponding to the shaded part O1, the first region is the shaded region O2 in slice image B and the second region is the part of slice image B other than the shaded region O2. The second region and the reference region are spliced to obtain the to-be-processed slice image corresponding to slice image B; for example, a schematic diagram of this to-be-processed slice image is shown in FIG. 4E, in which the reference region (the region delimited by the dashed frame in FIG. 4E) and the second region are spliced into the to-be-processed slice image according to the positional relationship between the first region and the second region.
For example, as shown in FIG. 4D, when the i-th slice image is slice image C and the reference region is the region of the processed slice image B' corresponding to the shaded part R1 (with a width of 32 pixels in the extension direction), the first region is the shaded region R2 in slice image C and the second region is the part of slice image C other than the shaded region R2 (with a width of (576-32) pixels). The second region and the reference region are spliced to obtain the to-be-processed slice image corresponding to slice image C; for example, a schematic diagram of this to-be-processed slice image is shown in FIG. 4E, in which the reference region (the region delimited by the dashed frame in FIG. 4E) and the second region are spliced into the to-be-processed slice image according to the positional relationship between the first region and the second region.
For example, in some embodiments, the i-th slice image may have more than one reference slice image. For example, for slice image E corresponding to slice region E shown in FIG. 3E, when the processing order is A-B-C-D-E-F-G-H-I, the reference slice image of slice image E may be slice image D or slice image B. In this case, when obtaining the to-be-processed slice image corresponding to slice image E, a first reference region may be determined based on slice image D and a second reference region based on slice image B; the reference regions are determined as described above and not repeated here. Afterwards, the region of slice image E other than the regions corresponding to the first reference region and the second reference region is taken as the second region, and the first reference region, the second reference region, and the second region are spliced to obtain the to-be-processed slice image. During tone processing, the to-be-processed slice image may be tone-processed according to the first reference region, i.e., the first reference region provides the tone reference for the to-be-processed slice image; or according to the second reference region, i.e., the second reference region provides the tone reference; or according to both the first reference region and the second reference region simultaneously, i.e., both provide the tone reference together. It should be noted that the first reference region and the second reference region partially overlap.
In some embodiments, the image data of the reference region in the reference slice image may instead be extracted during tone processing, the corresponding part of the i-th slice image may be replaced with that image data, and the replaced slice image may be used as the to-be-processed slice image for tone processing; the present disclosure does not limit this.
For example, performing tone processing on the to-be-processed slice image to obtain the processed slice image corresponding to the i-th slice image may include: performing tone processing on the second region of the to-be-processed slice image based on the reference region of the to-be-processed slice image, to obtain a processed region corresponding to the second region, where the reference region provides a tone reference for the tone processing of the second region, and the tone of the processed region is consistent with the tone of the reference region; and obtaining the processed slice image corresponding to the i-th slice image based on the reference region and the processed region.
For example, the reference region and the processed region may be spliced into the processed slice image according to the positional relationship between the reference region and the second region.
For example, the tone processing of the to-be-processed slice image may be implemented by a pre-trained color restoration model. For example, the color restoration model may be obtained by training a neural network model, e.g., a model with a U-Net structure; for example, a large number of original-image samples and their color-edited counterparts may be used as training data to train the neural network model, thereby building the color restoration model.
FIGS. 5A-5D are schematic diagrams of to-be-processed slice images and the corresponding processed slice images provided by at least one embodiment of the present disclosure. These to-be-processed slice images are obtained by applying the image processing method provided by at least one embodiment of the present disclosure to the initial image shown in FIG. 2.
For example, when the slicing of step S20 is applied to the initial image shown in FIG. 2, the slice regions may be distributed as shown in FIG. 3E and the processing order may be A-B-C-D-E-F-G-H-I; that is, applying step S20 to the initial image shown in FIG. 2 yields slice images A through I corresponding to slice regions A through I in FIG. 3E respectively.
For example, as shown in FIG. 5A, to-be-processed slice image A is slice image A itself; tone processing of to-be-processed slice image A yields processed slice image A' (shown by the right dashed frame in FIG. 5A).
For example, as shown in FIG. 5B, to-be-processed slice image B includes a reference region and a second region, where the reference region (shown by the solid frame in FIG. 5B) is a partial region of processed slice image A' in FIG. 5A and the second region (shown by the dashed frame in FIG. 5B) is a partial region of slice image B; the reference region and the second region are determined and spliced as described above and not repeated here. Tone processing of this to-be-processed slice image B yields processed slice image B' (shown by the right dashed frame in FIG. 5B).
For example, as shown in FIG. 5C, to-be-processed slice image C includes a reference region and a second region, where the reference region (shown by the solid frame in FIG. 5C) is a partial region of processed slice image B' in FIG. 5B and the second region (shown by the dashed frame in FIG. 5C) is a partial region of slice image C; the reference region and the second region are determined and spliced as described above and not repeated here. Tone processing of this to-be-processed slice image C yields processed slice image C' (shown by the right dashed frame in FIG. 5C).
For example, as shown in FIG. 5D, to-be-processed slice image D includes a reference region and a second region, where the reference region (shown by the solid frame in FIG. 5D) is a partial region of processed slice image A' in FIG. 5A and the second region (shown by the dashed frame in FIG. 5D) is a partial region of slice image D; the reference region and the second region are determined and spliced as described above and not repeated here. Tone processing of this to-be-processed slice image D yields processed slice image D' (shown by the right dashed frame in FIG. 5D).
In the tone processing described above, the base tone is first determined from the first slice image to be processed, and the other slice images are then processed based on the base tone. When each slice image is tone-processed, the processing is based on that slice image's reference region, ensuring that the tone of each slice image is consistent with the tone of its reference region; since the tone of the reference region is consistent with the base tone, all slice images end up with a uniform tone.
For example, in some embodiments, all pixels of the composite image are arranged in n rows and m columns, and step S40 may include: splicing the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image, to obtain an intermediate composite image whose pixels are also arranged in n rows and m columns; when the position at row t1, column t2 of the intermediate composite image includes only one pixel, using that pixel's value as the value of the pixel at row t1, column t2 of the composite image; and when the position at row t1, column t2 of the intermediate composite image includes a plurality of pixels, selecting the value of any one of those pixels as the value of the pixel at row t1, column t2 of the composite image. Here n, m, t1, and t2 are all positive integers, t1 is less than or equal to n, and t2 is less than or equal to m.
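A hedged sketch of the intermediate-composite rule above, assuming single-channel NumPy arrays and "last write wins" as one valid way of picking any one coincident pixel (coincident pixels are equal by construction, as explained below for points q1 and q2); `compose` and the offset convention are illustrative assumptions:

```python
import numpy as np

def compose(slices, offsets, out_h, out_w):
    """Splice processed slices into a composite image.

    Each slice is pasted at its (row, col) offset; where positions
    coincide, the value written last simply wins, which matches the
    "pick any one pixel" rule since coincident pixels carry equal
    values. A count map records how many slices cover each cell.
    """
    canvas = np.zeros((out_h, out_w))
    count = np.zeros((out_h, out_w), dtype=int)
    for img, (r, c) in zip(slices, offsets):
        h, w = img.shape
        canvas[r:r + h, c:c + w] = img
        count[r:r + h, c:c + w] += 1
    return canvas, count

a = np.full((4, 4), 1.0)
b = np.full((4, 4), 1.0)
comp, cnt = compose([a, b], [(0, 0), (0, 2)], 4, 6)
print(cnt.max())  # 2  (two pixels coincide in the overlap columns)
```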
For example, the composite image may be a color image, in which case each pixel value may include a set of RGB values; alternatively, the composite image may be a monochrome image, in which case each pixel value may be the value of a single color channel.
FIG. 6A is a schematic diagram of an intermediate composite image provided by at least one embodiment of the present disclosure. For example, A' through I' in FIG. 6A denote processed slice images A' through I' corresponding to slice regions A through I shown in FIG. 3E; each processed slice image is 576×576, the same size as the slice regions. For example, the shaded regions in FIG. 6A represent the coincident regions formed among the processed slice images during splicing due to the overlap between slice regions; for example, the width of each coincident region in the extension direction equals the width of the corresponding overlap region in the first direction.
For example, in some embodiments, the pixel at row t1, column t2 of the intermediate composite image is point q1 in FIG. 6A, and q1 lies in the coincident region between processed slice image A' and processed slice image B'. In this case, the position at row t1, column t2 (i.e., point q1) of the intermediate composite image includes two pixels: the pixel at q1 in processed slice image A' and the pixel at q1 in processed slice image B'. For example, the value of either of the two pixels may be selected as the value of the pixel at row t1, column t2 of the composite image. For example, as described in step S30, when the reference slice image of slice image B is slice image A, this coincident region is, for processed slice image B', exactly the reference region of slice image B, and that reference region comes from processed slice image A'; the two pixels therefore have the same value, namely the value of the pixel at q1 in processed slice image A'.
For example, in other embodiments, the pixel at row t1, column t2 of the intermediate composite image is point q2 in FIG. 6A, and q2 lies in the coincident region among processed slice images A', B', D', and E'. In this case, the position at row t1, column t2 (i.e., point q2) of the intermediate composite image includes four pixels: the pixels at q2 in processed slice images A', B', D', and E' respectively. For example, the value of any one of the four pixels may be selected as the value of the pixel at row t1, column t2 of the composite image. For example, as described in step S30, when the reference slice image of slice image B is slice image A, the reference slice image of slice image D is slice image A, and the reference slice image of slice image E is slice image D, point q2 belongs to the reference region of slice image B as well as to the reference regions of slice images D and E; the four pixels therefore have the same value, namely the value of the pixel at q2 in processed slice image A'.
In other embodiments, the pixel at row t1, column t2 of the intermediate composite image is point q3 in FIG. 6A, and q3 lies in the coincident region between processed slice image B' and processed slice image C'. In this case, the position at row t1, column t2 (i.e., point q3) of the intermediate composite image includes two pixels: the pixel at q3 in processed slice image B' and the pixel at q3 in processed slice image C'. For example, the value of either of the two pixels may be selected as the value of the pixel at row t1, column t2 of the composite image.
For example, in some examples, when the reference slice image of slice image C is slice image B and the reference region of slice image C is only part of the processed overlap region, e.g., as shown in FIG. 4D, where the part of processed slice image B' corresponding to the shaded region R1 is selected as the reference region, the pixels in the region delimited by the dashed frame of the intermediate composite image in FIG. 6A may have two different values. For example, to keep model processing uniform, the pixel value from the processed slice image corresponding to the reference slice image may always be preferred as the value of the pixel at row t1, column t2 of the composite image; that is, here the value of the pixel at row t1, column t2 of processed slice image B' is selected as the value of the pixel at row t1, column t2 of the composite image.
For example, in other embodiments, the pixel at row t1, column t2 is point q4 in FIG. 6A, and q4 lies in a non-coincident region; in this case the position at row t1, column t2 (i.e., point q4) of the intermediate composite image includes exactly one pixel. For example, the value of that pixel is used as the value of the pixel at row t1, column t2 of the composite image.
For example, in other embodiments, at least part of each of the plurality of processed slice images may be extracted in turn according to the processing order to obtain a plurality of to-be-stitched images, after which the to-be-stitched images are stitched into the composite image according to the positional relationship of the plurality of slice images in the initial image. For example, each to-be-stitched image is the part of the corresponding processed slice image other than the processed overlap region.
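The extraction of to-be-stitched images can be sketched as below; the helper `to_stitch` and its `side` parameter are illustrative names (not from the disclosure), and only horizontal overlaps are handled for brevity:

```python
import numpy as np

def to_stitch(processed_slice, overlap_width, side="left"):
    """Drop the processed overlap region from a processed slice.

    `side` names the edge that overlaps the reference slice; the base
    slice passes overlap_width=0 and is kept whole.
    """
    if overlap_width == 0:
        return processed_slice
    if side == "left":
        return processed_slice[:, overlap_width:]
    return processed_slice[:, :-overlap_width]

b_prime = np.zeros((576, 576))
b_stitch = to_stitch(b_prime, 32, side="left")
print(b_stitch.shape)  # (576, 544)
```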
For example, for slice images A through I obtained from slice regions A through I shown in FIG. 3E, the processing order is A-B-C-D-E-F-G-H-I, and slice images A through I correspond to processed slice images A' through I' respectively.
For example, slice image A is the base slice image, so the to-be-stitched image A`` corresponding to slice image A is the complete processed slice image A'.
For example, the reference slice image of slice image B is slice image A, so the to-be-stitched image B`` corresponding to slice image B is the region of processed slice image B' other than the processed overlap region. For example, in processed slice image B' shown in FIG. 6B, the processed overlap region is the region of the processed reference slice image A' corresponding to the overlap region, the to-be-stitched image B`` is the region delimited by the bold black frame in FIG. 6B, and the size of B`` is 544×576 ((576-32)×576).
For example, the reference slice image of slice image C is slice image B, so the to-be-stitched image C`` corresponding to slice image C is the region of processed slice image C' other than the processed overlap region. For example, in processed slice image C' shown in FIG. 6C, the processed overlap region is the region of the processed reference slice image B' corresponding to the overlap region, the to-be-stitched image C`` is the region delimited by the bold black frame in FIG. 6C, and the size of C`` is 540×576 ((576-36)×576).
For example, the reference slice image of slice image D is slice image A, so the to-be-stitched image D`` corresponding to slice image D is the region of processed slice image D' other than the processed overlap region. For example, in processed slice image D' shown in FIG. 6D, the processed overlap region is the region of the processed reference slice image A' corresponding to the overlap region, the to-be-stitched image D`` is the region delimited by the bold black frame in FIG. 6D, and the size of D`` is 576×544 (576×(576-36)).
By analogy, to-be-stitched images E`` through I`` are determined in turn in the manner described above; the details are not repeated here.
Afterwards, the plurality of to-be-stitched images are stitched into the composite image according to the positional relationship of the plurality of slice images in the initial image.
FIG. 6E is a schematic diagram of a composite image provided by an embodiment of the present disclosure. As shown in FIG. 6E, A`` through I`` denote the to-be-stitched images corresponding to slice regions A through I shown in FIG. 3E; each to-be-stitched image is determined as described above, and the sizes of the to-be-stitched images are not all the same.
FIG. 6F is a schematic diagram of a composite image provided by an embodiment of the present disclosure; for example, FIG. 6F is the composite image obtained by applying the image processing method provided by at least one embodiment of the present disclosure to the initial image shown in FIG. 2. For example, this composite image is a color image.
FIG. 7A is a schematic diagram of an initial image provided by an embodiment of the present disclosure, FIG. 7B is a schematic diagram of a processed image provided by an embodiment of the present disclosure, and FIG. 7C is a schematic diagram of a slice image provided by an embodiment of the present disclosure.
For example, in some embodiments, edge cleaning must be performed on the to-be-processed slice images to remove, at the image edges, content that does not belong to the image. For example, in the initial image shown in FIG. 7A, the regions delimited by the black frames on the upper, lower, and left sides contain black borders that are not part of the image content; the initial image therefore needs edge cleaning to remove these black borders, yielding the processed image shown in FIG. 7B, which contains only the image content, without the black borders of the initial image.
However, after the initial image is sliced, parts of the resulting slice images may be cleaned away by mistake during edge cleaning. For example, in the slice image shown in FIG. 7C, the region delimited by the black dashed frame is likely to be treated as content that does not belong to the image and removed during edge cleaning.
To solve this problem of mistaken edge cleaning, the original image to be processed may be marked with a border to obtain the initial image, so that during image processing edge cleaning is applied only to marked borders and not to unmarked borders, preventing the edges of the slice images from being cleaned by mistake.
For example, step S10 may include: obtaining an original image; and adding a preset border along the edge of the original image on the side away from the center of the original image to obtain the initial image, where the initial image includes the original image and the preset border, and the color of the preset border is a preset color. In addition, before tone processing of the plurality of slice images, the method further includes: performing edge-clearing processing on the plurality of slice images.
For example, adding a preset border along the edge of the original image on the side away from the center of the original image to obtain the initial image may include: determining the edge of the original image; generating the preset border based on the edge of the original image, where the preset border includes a first edge and a second edge; and placing the preset border on the side of the original image's edge away from the center of the original image to obtain the initial image, where the first edge of the preset border coincides with the edge of the original image, the second edge of the preset border is separated from the edge of the original image by a second pixel width, and the edge of the initial image is the second edge.
For example, adding a preset border along the edge of the original image on the side away from the center of the original image to obtain the initial image may alternatively include: extending the edge of the original image outward, away from the center, by the second pixel width to form the preset border, taking the extended edge of the original image as the second edge of the preset border and the pre-extension edge of the original image as the first edge, and filling the preset border with the preset color, thereby obtaining the initial image.
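A minimal sketch of the border-adding approach above, assuming an RGB NumPy array and a red preset border as in the later example; `add_preset_border` and the default width are illustrative assumptions:

```python
import numpy as np

def add_preset_border(original, border_width=20, color=(255, 0, 0)):
    """Extend the original image outward by border_width pixels on
    every side and fill the new ring with the preset color (red here).
    The inner boundary of the ring corresponds to the first edge and
    the outer boundary to the second edge."""
    h, w, c = original.shape
    initial = np.empty((h + 2 * border_width, w + 2 * border_width, c), dtype=original.dtype)
    initial[...] = np.array(color, dtype=original.dtype)
    initial[border_width:border_width + h, border_width:border_width + w] = original
    return initial

orig = np.zeros((100, 200, 3), dtype=np.uint8)
init = add_preset_border(orig, border_width=20)
print(init.shape)  # (140, 240, 3)
```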
For example, when the slicing of step S20 is performed, the slicing is based on the original image with the preset border, so that a slice image whose edge coincides with the edge of the original image still retains the preset border, and edge-clearing processing can be applied to the slice image according to that preset border.
For example, each slice image includes four slice edges, and performing edge-clearing processing on the plurality of slice images may include: in response to detecting that the i-th slice edge of a slice image is part of the second edge, performing edge-clearing processing on that slice edge; and in response to detecting that the i-th slice edge of a slice image is not part of the second edge, not performing edge-clearing processing on that slice edge, where i is a positive integer less than or equal to 4.
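The per-edge decision can be sketched as a small helper; the `edges_to_clear` name and the `slice_box` coordinate convention are assumptions for illustration, with the slice's position in the initial image taken as known:

```python
def edges_to_clear(slice_box, image_size):
    """Decide which of a slice's four edges should be edge-cleared.

    An edge is cleared only when it coincides with the initial image's
    outer (second) edge. slice_box = (top, left, bottom, right) in the
    initial image; image_size = (height, width).
    """
    top, left, bottom, right = slice_box
    h, w = image_size
    return {
        "top": top == 0,
        "left": left == 0,
        "bottom": bottom == h,
        "right": right == w,
    }

# Upper half-slice of a 1000x800 initial image, as in FIG. 7E's slice (1):
print(edges_to_clear((0, 0, 500, 800), (1000, 800)))
# {'top': True, 'left': True, 'bottom': False, 'right': True}
```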
For example, the model is trained on training images carrying the preset border: when it detects that an edge of a training image carries the preset border, edge-clearing processing is applied to that image edge; when an edge of a training image does not carry the preset border, no edge-clearing processing is applied to that edge, so that non-edge details are not lost.
Edge cleaning may be performed in any feasible way; the present disclosure does not limit the edge cleaning process. For example, the edge region of the composite image may first be determined, and the edge region may be traversed by row and column scans of the initial image to determine whether any to-be-cleaned region exceeds a preset size threshold; in response to the edge region including at least one to-be-cleaned region whose size exceeds the preset threshold, the pixel values of the pixels corresponding to that to-be-cleaned region are set to a preset pixel value.
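Since the disclosure leaves the cleaning procedure open, the following is only a much-simplified sketch of a row/column scan with a size threshold, under assumed criteria (non-zero pixels counted as border residue, rows/columns as the cleaned regions); none of these choices come from the disclosure:

```python
import numpy as np

def clear_edge_runs(image, margin, threshold, fill=255):
    """Clear over-threshold runs of marked pixels near the image edges.

    Within a margin-pixel band around a single-channel image, any row
    or column whose count of marked (non-zero) pixels exceeds the
    threshold is reset to the fill value.
    """
    out = image.copy()
    h, w = image.shape
    for r in list(range(margin)) + list(range(h - margin, h)):
        if np.count_nonzero(image[r]) > threshold:
            out[r] = fill
    for c in list(range(margin)) + list(range(w - margin, w)):
        if np.count_nonzero(image[:, c]) > threshold:
            out[:, c] = fill
    return out

img = np.zeros((10, 10), dtype=np.uint8)
img[0, :] = 7  # a stripe of residue along the top edge
cleaned = clear_edge_runs(img, margin=2, threshold=5)
print(int(cleaned[0, 0]))  # 255
```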
For example, FIG. 7D is a schematic diagram of an initial image provided by an embodiment of the present disclosure. As shown in FIG. 7D, the dashed frame is the edge of the original image, and a preset border with a preset color and a preset width is added along the edge of the original image. For example, the preset color may be red, and the preset width may be set as needed, e.g., any value in the range of 20 to 50 pixels. The first edge of the preset border coincides with the edge of the original image, i.e., the dashed frame in the figure; the second edge of the preset border is the dash-dot frame in FIG. 7D, separated from the edge of the original image by the second pixel width.
FIG. 7E shows schematic diagrams of the slice images corresponding to the initial image shown in FIG. 7D. For example, the initial image shown in FIG. 7D is divided into slice image (1) and slice image (2) along its horizontal center line, and the dashed frames show the slice edges of slice images (1) and (2). In slice image (1), all slice edges except the lower slice edge are part of the second edge; therefore, during image processing, no edge-clearing processing is applied to the lower slice edge of slice image (1), while edge-clearing processing is applied to its upper, left, and right slice edges. In slice image (2), all slice edges except the upper slice edge are part of the second edge; therefore, no edge-clearing processing is applied to the upper slice edge of slice image (2), while edge-clearing processing is applied to its lower, left, and right slice edges.
FIG. 7F shows schematic diagrams of the processed slice images corresponding to the slice images shown in FIG. 7E. For example, edge cleaning as described above is first applied to slice images (1) and (2); then, tone processing based on the image processing method provided by embodiments of the present disclosure is applied to the edge-cleaned slice images (1) and (2), yielding processed slice images (1) and (2) shown in FIG. 7F, where processed slice image (1) corresponds to slice image (1) and processed slice image (2) corresponds to slice image (2). The upper edge content of processed slice image (2) is preserved, rather than being cleaned away as content that does not belong to the image.
For example, the edge cleaning method provided by at least one embodiment of the present disclosure adds a border mark to the original image, e.g., a preset border indicating the edge of the original image, so that edge cleaning is applied only to those edges of the slice images that carry the preset border, avoiding the loss of non-edge details.
FIGS. 8A to 8C show schematic diagrams of the image processing flow of the image processing method provided by an embodiment of the present disclosure. The specific execution of the image processing method provided by at least one embodiment of the present disclosure is described below with reference to FIGS. 8A to 8C.
FIG. 8A is a schematic diagram of an initial image provided by an embodiment of the present disclosure; the initial image shown in FIG. 8A is obtained from the original image shown in FIG. 2. As shown in FIG. 8A, the dashed frame is the edge of the original image, and a preset border with a preset color and a preset width is added along the edge of the original image; the first edge of the preset border coincides with the edge of the original image, i.e., the dashed frame in the figure, and the second edge of the preset border is the dash-dot frame in FIG. 8A, separated from the edge of the original image by the second pixel width.
For example, the slicing of step S20 may be applied to the initial image shown in FIG. 8A to obtain a plurality of slice images, where the slice images are determined from the initial image containing the preset border and each slice image has the slice size; the specific process is as described above and not repeated here. Afterwards, edge cleaning is applied to the slice images; the edge cleaning process is as described above and not repeated here.
Afterwards, tone processing is applied to the edge-cleaned slice images based on the image processing method provided by embodiments of the present disclosure; the specific process is as described in step S30 and not repeated here.
FIG. 8B shows schematic diagrams of the to-be-processed slice images corresponding to the initial image shown in FIG. 8A. As shown in FIG. 8B, the to-be-processed slice images in the figure are three to-be-processed slice images corresponding to three of the plurality of slice images, e.g., to-be-processed slice images (1), (2), and (3), where the regions delimited by the dash-dot frames in to-be-processed slice images (2) and (3) are their respective reference regions; the determination of the reference regions and the tone processing are as described above and not repeated here.
FIG. 8C shows schematic diagrams of the processed slice images corresponding to the to-be-processed slice images shown in FIG. 8B. For example, applying the edge-clearing processing and tone processing described above to the to-be-processed slice images shown in FIG. 8B yields processed slice images (1) to (3) shown in FIG. 8C (shown by the dashed frames in FIG. 8C). Here, processed slice images (1), (2), and (3) correspond to to-be-processed slice images (1), (2), and (3) respectively, and the edge content in processed slice images (1), (2), and (3) is preserved.
Afterwards, all tone-processed slice images are spliced based on the image processing method provided by embodiments of the present disclosure to obtain the composite image; the specific process is as described in step S40 and not repeated here.
FIG. 6F shows a schematic diagram of the processed image corresponding to the original image shown in FIG. 2. For example, the composite image may be de-bordered to remove the preset border, thereby obtaining the processed image, shown in FIG. 6F, that corresponds to the original image shown in FIG. 2.
At least one embodiment of the present disclosure further provides an image processing apparatus. FIG. 9 is a schematic block diagram of an image processing apparatus provided by at least one embodiment of the present disclosure.
As shown in FIG. 9, the image processing apparatus 900 may include: an obtaining unit 901, a slicing unit 902, a tone processing unit 903, and a synthesis unit 904.
For example, these modules may be implemented as hardware (e.g., circuit) modules, software modules, or any combination of the two; the same applies to the embodiments below and is not repeated. For example, these units may be implemented by a central processing unit (CPU), a graphics processing unit (GPU), a tensor processing unit (TPU), a field-programmable gate array (FPGA), or another form of processing unit with data processing and/or instruction execution capability, together with the corresponding computer instructions.
For example, the obtaining unit 901 is configured to obtain an initial image.
For example, the slicing unit 902 is configured to slice the initial image to obtain a plurality of slice images corresponding to the initial image.
For example, the tone processing unit 903 is configured to perform tone processing on the plurality of slice images to obtain a plurality of processed slice images, where the tones of the plurality of processed slice images are consistent.
For example, the synthesis unit 904 is configured to splice the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image, to obtain a composite image.
For example, the obtaining unit 901, the slicing unit 902, the tone processing unit 903, and the synthesis unit 904 may include code and programs stored in a memory, and a processor may execute the code and programs to implement some or all of the functions of these units as described above. For example, the obtaining unit 901, the slicing unit 902, the tone processing unit 903, and the synthesis unit 904 may be dedicated hardware devices used to implement some or all of those functions. For example, they may be one circuit board or a combination of circuit boards used to implement the functions described above. In an embodiment of the present application, the circuit board or combination of circuit boards may include: (1) one or more processors; (2) one or more non-transitory memories connected to the processors; and (3) processor-executable firmware stored in the memories.
It should be noted that the obtaining unit 901 may be used to implement step S10 shown in FIG. 1, the slicing unit 902 step S20, the tone processing unit 903 step S30, and the synthesis unit 904 step S40. For specific descriptions of the functions these units can implement, refer to the descriptions of steps S10 to S40 in the above embodiments of the image processing method; repetition is omitted. In addition, the image processing apparatus 900 can achieve technical effects similar to those of the image processing method described above, which are not repeated here.
It should be noted that, in embodiments of the present disclosure, the image processing apparatus 900 may include more or fewer circuits or units, and the connection relationships among the circuits or units are not limited and may be determined according to actual needs. The specific construction of each circuit or unit is not limited: each may be composed of analog devices according to circuit principles, of digital chips, or in another suitable manner.
At least one embodiment of the present disclosure further provides an electronic device. FIG. 10 is a schematic diagram of an electronic device provided by at least one embodiment of the present disclosure.
For example, as shown in FIG. 10, the electronic device includes a processor 1001, a communication interface 1002, a memory 1003, and a communication bus 1004. The processor 1001, the communication interface 1002, and the memory 1003 communicate with one another via the communication bus 1004; these components may also communicate via a network connection. The present disclosure does not limit the type or function of the network here. It should be noted that the components of the electronic device shown in FIG. 10 are exemplary rather than limiting; the electronic device may have other components according to actual application needs.
For example, the memory 1003 is used to non-transitorily store computer-readable instructions. The processor 1001 is used to execute the computer-readable instructions, whereupon the image processing method according to any of the above embodiments is implemented. For the specific implementation of each step of the image processing method and related explanations, refer to the above embodiments of the image processing method; repetition is omitted here.
For example, other implementations of the image processing method realized by the processor 1001 executing the computer-readable instructions stored in the memory 1003 are the same as the implementations mentioned in the foregoing method embodiments and are not repeated here.
For example, the communication bus 1004 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of representation, only one bold line is shown in the figure, which does not mean there is only one bus or one type of bus.
For example, the communication interface 1002 is used to implement communication between the electronic device and other devices.
For example, the processor 1001 and the memory 1003 may be located at a server (or in the cloud).
For example, the processor 1001 may control the other components of the electronic device to perform desired functions. The processor 1001 may be a central processing unit (CPU), a network processor (NP), a tensor processing unit (TPU), a graphics processing unit (GPU), or another device with data processing and/or program execution capability; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The central processing unit (CPU) may be of x86 or ARM architecture, etc.
The memory 1003 may include any combination of one or more computer program products, and the computer program products may include computer-readable storage media in various forms, e.g., volatile memory and/or non-volatile memory. Volatile memory may include, for example, random access memory (RAM) and/or cache. Non-volatile memory may include, for example, read-only memory (ROM), hard disks, erasable programmable read-only memory (EPROM), portable compact disc read-only memory (CD-ROM), USB memory, flash memory, etc. One or more computer-readable instructions may be stored on the computer-readable storage media, and the processor 1001 may run the computer-readable instructions to realize various functions of the electronic device. Various applications and various data may also be stored in the storage media.
For example, in some embodiments, the electronic device may further include an image acquisition component. The image acquisition component is used to acquire images, and the memory 1003 is further used to store the acquired images.
For example, the image acquisition component may be the camera of a smartphone, the camera of a tablet computer, the camera of a personal computer, the lens of a digital camera, or even a webcam.
For example, for a detailed description of the process by which the electronic device performs image processing, refer to the relevant descriptions in the embodiments of the image processing method; repetition is omitted.
FIG. 11 is a schematic diagram of a non-transitory computer-readable storage medium provided by at least one embodiment of the present disclosure. For example, as shown in FIG. 11, the storage medium 1100 may be a non-transitory computer-readable storage medium, and one or more computer-readable instructions 1101 may be non-transitorily stored on the storage medium 1100. For example, when the computer-readable instructions 1101 are executed by a processor, one or more steps of the image processing method described above may be performed.
For example, the storage medium 1100 may be applied in the electronic device described above; for example, the storage medium 1100 may include the memory in the electronic device.
For example, the storage medium may include the memory card of a smartphone, the storage component of a tablet computer, the hard disk of a personal computer, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), portable compact disc read-only memory (CD-ROM), flash memory, or any combination of the above storage media, as well as other suitable storage media.
For example, for a description of the storage medium 1100, refer to the description of the memory in the embodiments of the electronic device; repetition is omitted.
FIG. 12 shows a schematic diagram of a hardware environment provided by at least one embodiment of the present disclosure. The electronic device provided by the present disclosure may be applied in an Internet system.
The functions of the image processing apparatus and/or electronic device involved in the present disclosure may be implemented with the computer system provided in FIG. 12. Such computer systems may include personal computers, laptops, tablet computers, mobile phones, personal digital assistants, smart glasses, smart watches, smart rings, smart helmets, and any other smart portable or wearable devices. The particular system of this embodiment uses a functional block diagram to explain a hardware platform containing a user interface. Such a computer device may be a general-purpose computer device or a special-purpose computer device; both may be used to implement the image processing apparatus and/or electronic device of this embodiment. The computer system may include any component that implements the information needed for the presently described image processing. For example, the computer system can be realized by the computer device through its hardware devices, software programs, firmware, and combinations thereof. For convenience, only one computer device is drawn in FIG. 12, but the computer functions implementing the information needed for image processing described in this embodiment may be implemented in a distributed fashion by a set of similar platforms, spreading the processing load of the computer system.
As shown in FIG. 12, the computer system may include a communication port 250 connected to a network for data communication; for example, the computer system may send and receive information and data through the communication port 250, i.e., the communication port 250 enables the computer system to communicate wirelessly or by wire with other electronic devices to exchange data. The computer system may further include a processor group 220 (i.e., the processor described above) for executing program instructions; the processor group 220 may consist of at least one processor (e.g., a CPU). The computer system may include an internal communication bus 210. The computer system may include program storage units and data storage units in different forms (i.e., the memory or storage medium described above), e.g., a hard disk 270, read-only memory (ROM) 230, and random access memory (RAM) 240, which can be used to store various data files used by computer processing and/or communication, as well as the program instructions executed by the processor group 220. The computer system may further include an input/output component 260, which implements the input/output data flow between the computer system and other components (e.g., the user interface 280).
In general, the following devices may be connected to the input/output component 260: input devices including, for example, a touch screen, a touchpad, a keyboard, a mouse, a camera, a microphone, an accelerometer, and a gyroscope; output devices including, for example, a liquid crystal display (LCD), a speaker, and a vibrator; storage devices including, for example, a magnetic tape and a hard disk; and communication interfaces.
Although FIG. 12 shows a computer system with various devices, it should be understood that the computer system is not required to have all of the devices shown; it may instead have more or fewer devices.
The following points require explanation regarding the present disclosure:
(1) The drawings of the embodiments of the present disclosure involve only the structures involved in the embodiments of the present disclosure; other structures may refer to common designs.
(2) For clarity, in the drawings used to describe the embodiments of the present invention, the thicknesses and sizes of layers or structures are exaggerated. It will be understood that when an element such as a layer, film, region, or substrate is referred to as being "on" or "under" another element, it may be "directly" on or under the other element, or intervening elements may be present.
(3) Where there is no conflict, the embodiments of the present disclosure and the features in the embodiments may be combined with one another to obtain new embodiments.
The above are only specific implementations of the present disclosure, but the protection scope of the present disclosure is not limited thereto; the protection scope of the present disclosure shall be subject to the protection scope of the claims.
Claims (16)
- An image processing method, comprising: obtaining an initial image; slicing the initial image to obtain a plurality of slice images corresponding to the initial image; performing tone processing on the plurality of slice images to obtain a plurality of processed slice images, wherein the tones of the plurality of processed slice images are consistent; and splicing the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image, to obtain a composite image.
- The image processing method according to claim 1, wherein performing tone processing on the plurality of slice images to obtain the plurality of processed slice images comprises: determining a processing order; determining, based on the processing order, a base slice image among the plurality of slice images, wherein the base slice image is the first slice image on which tone processing is performed as determined by the processing order; performing tone processing on the base slice image to obtain a processed base slice image corresponding to the base slice image; taking the tone of the processed base slice image as a base tone; and performing tone processing, based on the base tone, on all slice images of the plurality of slice images other than the base slice image, to obtain the processed slice images respectively corresponding to said all slice images, wherein the tones of the processed slice images respectively corresponding to said all slice images are consistent with the base tone, and the plurality of processed slice images comprise the processed base slice image and the processed slice images respectively corresponding to said all slice images.
- The image processing method according to claim 2, wherein performing tone processing, based on the base tone, on all slice images of the plurality of slice images other than the base slice image, to obtain the processed slice images respectively corresponding to said all slice images, comprises: for an i-th slice image of said all slice images: determining a reference region corresponding to the i-th slice image, wherein the reference region provides a tone reference for the tone processing of the i-th slice image, and the tone of the reference region is consistent with the base tone; and performing tone processing on the i-th slice image based on the reference region, to obtain the processed slice image corresponding to the i-th slice image, wherein i is a positive integer.
- The image processing method according to claim 3, wherein determining the reference region corresponding to the i-th slice image comprises: determining, among the plurality of slice images, a reference slice image corresponding to the i-th slice image, wherein the region of the reference slice image that overlaps the i-th slice image is an overlap region, and the reference slice image has undergone tone processing before the tone processing of the i-th slice image; obtaining a processed reference slice image corresponding to the reference slice image; determining a processed overlap region based on the overlap region and the processed reference slice image, wherein the processed overlap region is the region of the processed reference slice image corresponding to the overlap region; and determining at least part of the processed overlap region as the reference region.
- The image processing method according to claim 4, wherein determining at least part of the processed overlap region as the reference region comprises: in response to the width of the processed overlap region in an extension direction being equal to a first pixel width, using the processed overlap region as the reference region; and in response to the width of the processed overlap region in the extension direction being greater than the first pixel width, selecting a partial region of the processed overlap region as the reference region, wherein the width of the partial region in the extension direction is the first pixel width; wherein the extension direction is the direction of the line connecting the center of the processed overlap region and the center of the region of the processed reference slice image other than the processed overlap region.
- The image processing method according to claim 3, wherein performing tone processing on the i-th slice image based on the reference region, to obtain the processed slice image corresponding to the i-th slice image, comprises: determining the region of the i-th slice image corresponding to the reference region as a first region; determining the region of the i-th slice image other than the first region as a second region; splicing the second region and the reference region to obtain a to-be-processed slice image corresponding to the i-th slice image, wherein the size of the to-be-processed slice image is the same as the size of the i-th slice image; and performing tone processing on the to-be-processed slice image to obtain the processed slice image corresponding to the i-th slice image.
- The image processing method according to claim 6, wherein performing tone processing on the to-be-processed slice image to obtain the processed slice image corresponding to the i-th slice image comprises: performing tone processing on the second region of the to-be-processed slice image based on the reference region of the to-be-processed slice image, to obtain a processed region corresponding to the second region, wherein the reference region provides a tone reference for the tone processing of the second region, and the tone of the processed region is consistent with the tone of the reference region; and obtaining the processed slice image corresponding to the i-th slice image based on the reference region and the processed region.
- The image processing method according to claim 1, wherein slicing the initial image to obtain the plurality of slice images corresponding to the initial image comprises: determining a slice size; dividing the initial image into a plurality of slice regions according to the slice size, wherein each slice region at least partially overlaps every slice region adjacent to it; and slicing the initial image according to the plurality of slice regions to obtain the plurality of slice images, wherein the plurality of slice images correspond one-to-one to the plurality of slice regions, and each slice image includes one of the plurality of slice regions.
- The image processing method according to claim 1, wherein all pixels of the composite image are arranged in n rows and m columns, and splicing the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image to obtain the composite image comprises: splicing the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image, to obtain an intermediate composite image, wherein all pixels of the intermediate composite image are arranged in n rows and m columns; in a case where row t1, column t2 of the intermediate composite image includes only one pixel, taking the pixel value of the one pixel at row t1, column t2 as the pixel value of the pixel at row t1, column t2 of the composite image; and in a case where row t1, column t2 of the intermediate composite image includes a plurality of pixels, selecting the pixel value of any one of the plurality of pixels as the pixel value of the pixel at row t1, column t2 of the composite image, wherein n, m, t1, and t2 are all positive integers, t1 is less than or equal to n, and t2 is less than or equal to m.
- The image processing method according to claim 1, wherein obtaining the initial image comprises: obtaining an original image; and adding a preset border along the edge of the original image on the side away from the center of the original image, to obtain the initial image, wherein the initial image includes the original image and the preset border, and the color of the preset border is a preset color; and wherein, before performing tone processing on the plurality of slice images, the method further comprises: performing edge-clearing processing on the plurality of slice images.
- The image processing method according to claim 10, wherein adding the preset border along the edge of the original image on the side away from the center of the original image, to obtain the initial image, comprises: determining the edge of the original image; generating the preset border based on the edge of the original image, wherein the preset border includes a first edge and a second edge; and placing the preset border on the side of the edge of the original image away from the center of the original image, to obtain the initial image, wherein the first edge of the preset border coincides with the edge of the original image, the second edge of the preset border is separated from the edge of the original image by a second pixel width, and the edge of the initial image is the second edge.
- The image processing method according to claim 11, wherein each slice image includes four slice edges, and performing edge-clearing processing on the plurality of slice images comprises: in response to detecting that the i-th slice edge of each slice image is part of the second edge, performing edge-clearing processing on the i-th slice edge; and in response to detecting that the i-th slice edge of each slice image is not part of the second edge, not performing edge-clearing processing on the i-th slice edge, wherein i is a positive integer less than or equal to 4.
- The image processing method according to any one of claims 1-12, wherein the composite image is a color image.
- An image processing apparatus, comprising: an obtaining unit configured to obtain an initial image; a slicing unit configured to slice the initial image to obtain a plurality of slice images corresponding to the initial image; a tone processing unit configured to perform tone processing on the plurality of slice images to obtain a plurality of processed slice images, wherein the tones of the plurality of processed slice images are consistent; and a synthesis unit configured to splice the plurality of processed slice images according to the positional relationship of the plurality of slice images in the initial image, to obtain a composite image.
- An electronic device, comprising: a memory non-transitorily storing computer-executable instructions; and a processor configured to run the computer-executable instructions, wherein the computer-executable instructions, when run by the processor, implement the image processing method according to any one of claims 1-13.
- A non-transitory computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the image processing method according to any one of claims 1-13.
Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202110380615.1A (CN113096043B) | 2021-04-09 | 2021-04-09 | Image processing method and apparatus, electronic device and storage medium |
| CN202110380615.1 | | | |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2022213784A1 | 2022-10-13 |
Family ID: 76675390
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2022/081286 (WO2022213784A1) | Image processing method and apparatus, electronic device and storage medium | 2021-04-09 | 2022-03-16 |

Country Status (2)

| Country | Link |
|---|---|
| CN | CN113096043B |
| WO | WO2022213784A1 |
Families Citing this family (2)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113096043B * | 2021-04-09 | 2023-02-17 | 杭州睿胜软件有限公司 | Image processing method and apparatus, electronic device and storage medium |
| CN116596931B * | 2023-07-18 | 2023-11-17 | 宁德时代新能源科技股份有限公司 | Image processing method, apparatus, device, storage medium and program product |
Citations (5)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20040258302A1 * | 2003-04-08 | 2004-12-23 | Seiko Epson Corporation | Image processor, image processing method and program |
| CN101441763A * | 2008-11-11 | 2009-05-27 | 浙江大学 | Unified adjustment method for multi-tone images based on color transfer |
| CN106846285A * | 2016-12-30 | 2017-06-13 | 苏州中科天启遥感科技有限公司 | High-performance remote sensing image synthesis method and apparatus |
| CN108564532A * | 2018-03-30 | 2018-09-21 | 合肥工业大学 | Mosaicking method for large-scale ground-range spaceborne SAR images |
| CN113096043A * | 2021-04-09 | 2021-07-09 | 杭州睿胜软件有限公司 | Image processing method and apparatus, electronic device and storage medium |
Family Cites Families (19)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2009065364A * | 2007-09-05 | 2009-03-26 | Toshiba Corp | Image processing method, image processing apparatus, and printed matter |
| US9576349B2 * | 2010-12-20 | 2017-02-21 | Microsoft Technology Licensing, Llc | Techniques for atmospheric and solar correction of aerial images |
| CN103279939B * | 2013-04-27 | 2016-01-20 | 北京工业大学 | Image stitching processing system |
| CN103489171B * | 2013-09-22 | 2016-04-27 | 武汉大学 | Automatic dodging and color-balancing method for wide-area remote sensing images based on a standard color library |
| JPWO2016067456A1 * | 2014-10-31 | 2017-08-10 | オリンパス株式会社 | Image processing method and cell sorting method |
| CN104933671B * | 2015-05-25 | 2018-05-25 | 北京邮电大学 | Image color fusion method |
| CN105427372A * | 2015-06-11 | 2016-03-23 | 北京吉威时代软件股份有限公司 | TIN-based color-consistency processing technique for orthophoto mosaics |
| CN104992408B * | 2015-06-30 | 2018-06-05 | 百度在线网络技术(北京)有限公司 | Panoramic image generation method and apparatus for a user terminal |
| EP3236486A1 * | 2016-04-22 | 2017-10-25 | Carl Zeiss Microscopy GmbH | Method for generating a composite image of an object and particle beam device for carrying out the method |
| CN105931186B * | 2016-04-26 | 2019-04-02 | 电子科技大学 | Panoramic video stitching system and method based on automatic camera calibration and color correction |
| CN106254844B * | 2016-08-25 | 2018-05-22 | 成都易瞳科技有限公司 | Color correction method for panoramic stitching |
| CN106412461B * | 2016-09-14 | 2019-07-23 | 豪威科技(上海)有限公司 | Video stitching method |
| CN108230376B * | 2016-12-30 | 2021-03-26 | 北京市商汤科技开发有限公司 | Remote sensing image processing method and apparatus, and electronic device |
| CN107820067B * | 2017-10-29 | 2019-09-20 | 苏州佳世达光电有限公司 | Stitching method and stitching apparatus for multi-projection images |
| CN109493281A * | 2018-11-05 | 2019-03-19 | 北京旷视科技有限公司 | Image processing method and apparatus, electronic device, and computer-readable storage medium |
| CN109697705B * | 2018-12-24 | 2019-09-03 | 北京天睿空间科技股份有限公司 | Color-difference correction method suitable for video stitching |
| CN112070708B * | 2020-08-21 | 2024-03-08 | 杭州睿琪软件有限公司 | Image processing method, image processing apparatus, electronic device, and storage medium |
| CN112007359B * | 2020-08-24 | 2023-12-15 | 杭州睿琪软件有限公司 | Image display method, readable storage medium, and computer device |
| CN112149561B * | 2020-09-23 | 2024-04-16 | 杭州睿琪软件有限公司 | Image processing method and apparatus, electronic device, and storage medium |

Prosecution timeline:

- 2021-04-09: application CN202110380615.1A filed in China; granted as patent CN113096043B (status: Active)
- 2022-03-16: PCT application PCT/CN2022/081286 filed (WO2022213784A1, Application Filing)
Non-Patent Citations (2)

- HUANG, SHICUN; WANG, AICHUN; LI, JUNJIE; CHEN, JUNYING; HAN, QIJIN. "The First National Water Conservancy Census Soil and Water Conservation Census-Realization of the Environment-1 Satellite Remote Sensing Image Project." Proceedings of the 16th China Environmental Remote Sensing Application Technology Forum, CN, 29 March 2012, pp. 177-185, XP009541866. *
- YANG, YUHANG. "The Research and Realization of Color Normalization and Cloud Removal in Large Scale Remote Sensing Image." Master Thesis, Tianjin Polytechnic University, CN, no. 6, 15 June 2020, ISSN 1674-0246, XP055974887. *
Also Published As

| Publication number | Publication date |
|---|---|
| CN113096043B | 2023-02-17 |
| CN113096043A | 2021-07-09 |
Legal Events

| Code | Title | Description |
|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22783853; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 22783853; Country of ref document: EP; Kind code of ref document: A1 |