WO2023151385A1 - Image processing method and apparatus, terminal, and readable storage medium - Google Patents

Image processing method and apparatus, terminal, and readable storage medium

Info

Publication number: WO2023151385A1
Application number: PCT/CN2022/139725
Authority: WIPO (PCT)
Prior art keywords: processed, filtering, image, area, image block
Other languages: English (en), French (fr)
Inventor: 李勇华
Original Assignee: Oppo广东移动通信有限公司
Application filed by Oppo广东移动通信有限公司
Publication of WO2023151385A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 General purpose image data processing
    • G06T 1/0007 Image acquisition
    • G06T 1/60 Memory management

Definitions

  • the present application relates to the field of image technology, and in particular to an image processing method, an image processing device, a terminal, and a non-volatile computer-readable storage medium.
  • the image is divided into blocks, and when the image block is processed, the image data in the processing area corresponding to each image block is acquired.
  • Embodiments of the present application provide an image processing method, an image processing device, a terminal, and a non-volatile computer-readable storage medium.
  • the image processing method in the embodiments of the present application includes: determining, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; obtaining the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of an image block to be processed adjacent to the current image block to be processed; and obtaining first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used when the adjacent image block to be processed is filtered.
  • the image processing device in the embodiment of the present application includes a first determination module, a first acquisition module, and a second acquisition module.
  • the first determination module is used to determine, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; the first acquisition module is used to obtain the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of an image block to be processed adjacent to the current image block to be processed; and the second acquisition module is used to obtain first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is provided for the corresponding adjacent image block to be processed to use when it is filtered.
  • the terminal in the embodiments of the present application includes a processor configured to: determine, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; obtain the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of an image block to be processed adjacent to the current image block to be processed; and obtain first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used when the adjacent image block to be processed is filtered.
  • a non-volatile computer-readable storage medium of the present application contains a computer program which, when executed by one or more processors, causes the processors to execute an image processing method.
  • the image processing method includes: determining, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; obtaining the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of an image block to be processed adjacent to the current image block to be processed; and obtaining first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used when the adjacent image block to be processed is filtered.
  • FIG. 1 is a schematic flow diagram of an image processing method in some embodiments of the present application.
  • FIG. 2 is a schematic block diagram of an image processing device in some embodiments of the present application.
  • FIG. 3 is a schematic plan view of a terminal in some embodiments of the present application.
  • FIGS. 4 to 7 are schematic diagrams of the principles of some embodiments of the present application.
  • FIG. 8 and FIG. 9 are schematic flowcharts of image processing methods in some embodiments of the present application.
  • Fig. 10 is a schematic diagram of the principles of some embodiments of the present application.
  • FIG. 11 and FIG. 12 are schematic flowcharts of image processing methods in some embodiments of the present application.
  • Fig. 13 is a schematic diagram of connection between a processor and a computer-readable storage medium in some embodiments of the present application.
  • the image processing method of the present application includes: determining, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; obtaining the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of an adjacent image block to be processed; and obtaining first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used when the adjacent image block to be processed is filtered.
  • the filtering layer includes multiple layers, and the image to be processed is sequentially filtered through the multi-layer filtering layer.
  • the image processing method also includes: determining the filtering radius of the current filtering layer according to the preset sub-filtering radius of the current filtering layer and the preset sub-filtering radii of the filtering layers after the current filtering layer.
  • the image processing method further includes: determining the output area of the filtering area of the current filtering layer according to the sub-filtering radii of the filtering layers after the current filtering layer; determining the calculation area of the current image block to be processed according to the position information and output area of the current image block to be processed and the output area of an adjacent image block to be processed that has completed filtering processing; and re-determining the first filtering area according to the sub-filtering radius of the current filtering layer and the calculation area. Acquiring the first image data and the second image data includes: acquiring, from the set composed of the first image data and the second image data, the image data corresponding to the re-determined first filtering area.
  • the position information includes at least one of a first edge, a second edge and a middle position of the image to be processed. Determining the calculation area of the current image block to be processed according to the position information and output area of the current image block to be processed and the output area of an adjacent image block to be processed that has completed filtering processing includes: when the position information includes the first edge, determining the calculation area according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the first edge of the current image block to be processed; when the position information includes the second edge, determining the calculation area according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the second edge of the current image block to be processed; when the position information includes the first edge and the second edge, determining the calculation area according to the output area of the current image block to be processed; and when the position information is the middle position, determining the calculation area according to the output area of the current image block to be processed and the output areas of the two image blocks to be processed respectively adjacent to the first edge and the second edge of the current image block to be processed.
  • the image processing method further includes: performing filtering processing on the image data corresponding to the re-determined first filtering area to obtain first filtering data; obtaining, according to the position information of the current image block to be processed, the second filtering data of the part of the output area of an adjacent image block to be processed that has completed filtering processing which overlaps with the output area of the current image block to be processed; and outputting the filtering data of the output area of the current image block to be processed according to the first filtering data and the second filtering data.
  • the filtering radius includes a first filtering radius, a second filtering radius, a third filtering radius and a fourth filtering radius; the first, second, third and fourth filtering radii are all the same; or the first filtering radius is the same as the third filtering radius and the second filtering radius is the same as the fourth filtering radius; or the first, second, third and fourth filtering radii are different from one another.
  • the image processing method further includes storing the second image data in the memory after the filtering process of the image block to be processed is completed.
  • the image processing method further includes re-determining a filter radius according to preset downsampling parameters, and performing filter processing on the image block to be processed.
  • the image processing method further includes: after the current image block to be processed completes filtering processing and the filtering data of the output area of the current image block to be processed is output, determining, according to the position information of the current image block to be processed, an image block to be processed that is adjacent to the current image block to be processed and has not been filtered.
  • the image processing device of the present application includes a first determination module, a first acquisition module, and a second acquisition module.
  • the first determination module is used to determine, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed;
  • the first acquisition module is used to obtain the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of an image block to be processed adjacent to the current image block to be processed; and
  • the second acquisition module is used to acquire first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used by the adjacent image block to be processed during filtering.
  • the terminal of the present application includes a processor configured to: determine, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; obtain the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of an adjacent image block to be processed; and obtain first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used when the adjacent image block to be processed is filtered.
  • the filtering layer includes multiple layers, the image to be processed is filtered successively by the multiple filtering layers, and the processor is also used to determine the filtering radius of the current filtering layer according to the preset sub-filtering radius of the current filtering layer and the preset sub-filtering radii of the filtering layers after the current filtering layer.
  • the processor is further configured to: determine the output area of the filtering area of the current filtering layer according to the sub-filtering radii of the filtering layers after the current filtering layer; determine the calculation area of the current image block to be processed according to the position information and output area of the current image block to be processed and the output area of an adjacent image block to be processed that has completed filtering processing; re-determine the first filtering area according to the sub-filtering radius of the current filtering layer and the calculation area; and acquire, from the set composed of the first image data and the second image data, the image data corresponding to the re-determined first filtering area.
  • the position information includes at least one of a first edge, a second edge and a middle position of the image to be processed.
  • the processor is further configured to: when the position information includes the first edge, determine the calculation area according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the first edge of the current image block to be processed; when the position information includes the second edge, determine the calculation area according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the second edge of the current image block to be processed; when the position information includes the first edge and the second edge, determine the calculation area according to the output area of the current image block to be processed; and when the position information is the middle position, determine the calculation area according to the output area of the current image block to be processed and the output areas of the two image blocks to be processed respectively adjacent to the first edge and the second edge of the current image block to be processed.
  • the processor is further configured to: filter the image data corresponding to the re-determined first filtering area to obtain first filtering data; obtain, according to the position information of the current image block to be processed, the second filtering data of the part of the output area of an adjacent image block to be processed that has completed filtering processing which overlaps with the output area of the current image block to be processed; and output the filtering data of the output area of the current image block to be processed according to the first filtering data and the second filtering data.
  • the filtering radius includes a first filtering radius, a second filtering radius, a third filtering radius and a fourth filtering radius; the first, second, third and fourth filtering radii are all the same; or the first filtering radius is the same as the third filtering radius and the second filtering radius is the same as the fourth filtering radius; or the first, second, third and fourth filtering radii are different from one another.
  • the processor is further configured to store the second image data in the memory after the filtering process of the image block to be processed is completed.
  • the processor is further configured to re-determine a filtering radius according to a preset down-sampling parameter, and perform filtering processing on the image block to be processed.
  • the processor is further configured to, after the current image block to be processed completes filtering processing and the filtering data of the output area of the current image block to be processed is output, determine, according to the position information of the current image block to be processed, an image block to be processed that is adjacent to the current image block to be processed and has not been filtered.
  • the non-volatile computer-readable storage medium of the present application includes a computer program which, when executed by a processor, causes the processor to execute the image processing method in any one of the above-mentioned embodiments.
  • referring to FIG. 1 to FIG. 3, the image processing method of the embodiments of the present application comprises the following steps:
  • 011: determine, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed;
  • 012: obtain the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of the image block to be processed adjacent to the current image block to be processed; and
  • 013: obtain first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used when the adjacent image block to be processed is filtered.
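  • as an illustration only (not part of the patent disclosure), the following minimal Python sketch walks through steps 011 to 013 for one block and one uniform filtering radius; the rectangle representation, the expand and intersect helpers and the example coordinates are assumptions made for this sketch.

```python
# Hypothetical sketch of steps 011-013 for one image block and one uniform radius.
# Rectangles are (x0, y0, x1, y1) in image coordinates; all names are illustrative.

def expand(block, r):
    """Step 011: the filtering area is the block grown by the filtering radius r."""
    x0, y0, x1, y1 = block
    return (x0 - r, y0 - r, x1 + r, y1 + r)

def intersect(a, b):
    """Intersection of two axis-aligned rectangles, or None if they do not overlap."""
    x0, y0 = max(a[0], b[0]), max(a[1], b[1])
    x1, y1 = min(a[2], b[2]), min(a[3], b[3])
    return (x0, y0, x1, y1) if x0 < x1 and y0 < y1 else None

r = 2
current_block, right_neighbour = (0, 0, 8, 8), (8, 0, 16, 8)
first_area = expand(current_block, r)          # first filtering area
second_area = expand(right_neighbour, r)       # second filtering area
overlap = intersect(first_area, second_area)   # step 012
print(first_area, overlap)                     # (-2, -2, 10, 10) (6, -2, 10, 10)
# Step 013: the pixels of first_area outside `overlap` form the first image data;
# the pixels inside `overlap` form the second image data, kept for the neighbour.
```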
  • the image processing device 10 in the embodiment of the present application includes a first determination module 11 , a first acquisition module 12 and a second acquisition module 13 .
  • the first determination module 11 , the first acquisition module 12 and the second acquisition module 13 are respectively used to execute step 011 , step 012 and step 013 .
  • the first determination module 11 is used to determine, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; the first acquisition module 12 is used to obtain the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of the image block to be processed adjacent to the current image block to be processed; the second acquisition module 13 is used to acquire first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used by the adjacent image block to be processed during filtering.
  • the terminal 100 in the embodiment of the present application includes a processor 30 .
  • the processor 30 is configured to: determine, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; obtain the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of the image block to be processed adjacent to the current image block to be processed; and obtain first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used when the adjacent image block to be processed is filtered. That is to say, step 011, step 012 and step 013 can be implemented by the processor 30.
  • the terminal 100 further includes a housing 40.
  • the terminal 100 may be a mobile phone, a tablet computer, a display device, a notebook computer, a teller machine, a gate machine, a smart watch, a head-mounted display device, a game console, and the like. As shown in FIG. 3, the embodiments of the present application are described by taking the terminal 100 being a mobile phone as an example. It can be understood that the specific form of the terminal 100 is not limited to the mobile phone.
  • the housing 40 can also be used to install functional modules such as a display device, an imaging device, a power supply device, and a communication device of the terminal 100, so that the housing 40 provides protection against dust, drop, and water for the functional modules.
  • the image to be processed may be an image taken by the camera 20 of the terminal 100, or an image downloaded from the Internet, which is not limited here.
  • the image to be processed may also be a part of the image captured by the camera 20 .
  • the image to be processed may also be a depth image.
  • the preset filtering algorithm can determine the preset size of the image to be processed, the preset number of filter layers, and the filter radius of each filter layer.
  • the image to be processed includes a first edge, a second edge, a third edge and a fourth edge, the first edge is opposite to the third edge, and the second edge is opposite to the fourth edge.
  • the first edge can be the top edge and the second edge can be the left edge; or, the first edge can be the bottom edge and the second edge can be the right edge; or, the first edge can be the top edge and the second edge can be the right edge Edge etc.
  • the first edge may be the upper edge, and the second edge may be the left edge as an example for illustration.
  • the image to be processed can be divided in advance into multiple image blocks to be processed according to the preset size, so that the processor 30 can directly obtain the position information of each image block to be processed, such as the vertex coordinates of the image block to be processed (taking the image block to be processed being a rectangle as an example) and the position of the image block to be processed in the image to be processed (for example, the image block to be processed is located on the left edge, the upper edge, or both the left edge and the top edge (i.e. the top left corner)).
  • the processor 30 can also directly acquire the filtering radius of each filtering layer.
  • the processor 30 can determine the filtering area of each image block to be processed according to the filtering radius of the current filtering layer.
  • the filtering radius is the distance from the edge of the image block to be processed.
  • the filtering area may be determined by expanding the edge of the image block to be processed according to the filtering radius.
  • the filtering area may be determined according to the position information of the image block to be processed (such as the vertex coordinates of the image block to be processed) and the filtering radius.
  • the processor 30 can determine the vertex coordinates of the filtering area according to the vertex coordinates and the filtering radius of the image block to be processed.
  • for example, an image coordinate system is established with the upper left corner of the image block A1 to be processed as the origin, the width of the image block A1 to be processed is 8 (the number of pixels along the W direction), and the height is 8 (the number of pixels along the H direction); then the vertex coordinates of the image block A1 to be processed are (0,0), (8,0), (0,8) and (8,8). If the filtering radius is 2 pixels, two rows of pixels are added at the upper edge and the lower edge of the image block A1 to be processed, and two columns of pixels are added at the left edge and the right edge respectively, so the vertex coordinates of the filtering area S1 are (-2,-2), (10,-2), (-2,10) and (10,10). In this way, according to the position information of the image block A1 to be processed and the filtering radius of each filtering layer, the filtering area S1 of each filtering layer can be quickly determined.
  • the filtering radius includes a first filtering radius (such as corresponding to the upper side of the image), a second filtering radius (such as corresponding to the right side of the image), a third filtering radius (such as corresponding to the lower side of the image) and a fourth filtering radius (such as corresponding to the left side of the image); the first filtering radius, the second filtering radius, the third filtering radius and the fourth filtering radius are all the same (as shown in FIG. 4); or the first filtering radius is the same as the third filtering radius and the second filtering radius is the same as the fourth filtering radius; or the first filtering radius, the second filtering radius, the third filtering radius and the fourth filtering radius are different from one another.
  • for example, the image coordinate system is established with the upper left corner of the image block A1 to be processed as the origin, and the vertex coordinates of the image block A1 to be processed are (0,0), (8,0), (0,8) and (8,8). If the first filtering radius and the third filtering radius are the same and both are 2 pixels, and the second filtering radius and the fourth filtering radius are the same and both are 1 pixel, then two rows of pixels are added at the upper edge and the lower edge of the image block A1 to be processed, and one column of pixels is added at the left edge and the right edge respectively, so the vertex coordinates of the filtering area S1 are (-1,-2), (9,-2), (-1,10) and (9,10). In this way, the filtering area S1 of each filtering layer can be quickly determined.
  • for another example, the image coordinate system is established with the upper left corner of the image block A1 to be processed as the origin, and the vertex coordinates of the image block A1 to be processed are (0,0), (8,0), (0,8) and (8,8). If the first filtering radius, the second filtering radius, the third filtering radius and the fourth filtering radius are 4 pixels, 3 pixels, 2 pixels and 1 pixel respectively, then 4 rows of pixels and 2 rows of pixels are added at the upper edge and the lower edge of the image block A1 to be processed respectively, and 1 column of pixels and 3 columns of pixels are added at the left edge and the right edge respectively, so the vertex coordinates of the filtering area S1 are (-1,-4), (11,-4), (-1,10) and (11,10). In this way, the filtering area S1 of each filtering layer can be quickly determined and obtained.
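  • purely as an illustration of the vertex arithmetic in the three examples above (not the patent's implementation), the sketch below derives the filtering-area vertices from a block's vertices and per-side filtering radii; the function name and the rectangle representation are assumptions.

```python
# Illustrative only: expand an 8x8 block by per-side filtering radii (in pixels).
# A rectangle is (x0, y0, x1, y1); the block's upper-left corner is the origin.

def filtering_area(block, r_top, r_right, r_bottom, r_left):
    x0, y0, x1, y1 = block
    return (x0 - r_left, y0 - r_top, x1 + r_right, y1 + r_bottom)

block_a1 = (0, 0, 8, 8)
print(filtering_area(block_a1, 2, 2, 2, 2))  # (-2, -2, 10, 10): all radii equal to 2
print(filtering_area(block_a1, 2, 1, 2, 1))  # (-1, -2, 9, 10): radii 2/1/2/1
print(filtering_area(block_a1, 4, 3, 2, 1))  # (-1, -4, 11, 10): radii 4/3/2/1
```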
  • the processing area is generally larger than the area where the image block itself is located.
  • the amount of image data required for image processing is relatively large.
  • the two adjacent image blocks to be processed each need to read the image data of the area outside the overlapping area (that is, the first image data) from the memory 50 (such as a dynamic random access memory, DRAM), and each also needs to read the second image data from the memory 50 of the terminal into the internal memory 60 (such as a tightly coupled memory (Tightly Coupled Memories, TCM) or a memory on the vision processing unit (Vision Processing Unit, VPU) of the processing chip of the terminal 100).
  • in this case, the second image data needs to be read twice from the memory 50; that is, the total amount of data to be read is 2*(first image data + second image data).
  • the second image data can be stored in the level-1 cache of the VPU, so that it can be conveniently used when the adjacent, not-yet-filtered image block to be processed is filtered, and can be read and written faster when the second image data is reused, improving the filtering efficiency and lowering the power consumption.
  • it can be understood that the first image data of different image blocks to be processed may be different; the amount of read data here is only for illustration, and it is not limited that the first image data of two adjacent image blocks to be processed are the same.
  • in the present application, the second image data is provided to the image block to be processed that is adjacent to the current image block to be processed and has not been filtered, that is, the second image data is reused; it only needs to be read once from the memory 50 into the internal memory 60 to realize the filtering processing of the two adjacent image blocks to be processed, and the total amount of data read is 2*first image data + second image data. That is to say, by adopting the technical solution of the present application, one read of the second image data can be saved for every two adjacent image blocks to be processed, thereby reducing the amount of data read for the filtering processing and improving the filtering efficiency.
  • for example, the processor 30 may store the second image data of the current image block to be processed in the internal memory 60, so that it can be used when the adjacent image block to be processed that has not been filtered is filtered.
  • for example, a transfer memory can be set separately in the internal memory 60 (a part of the storage space in the level-1 cache of the VPU is designated as the transfer memory); after the current image block to be processed has obtained the image data of its first filtering area, the second image data of the overlapping area in the first filtering area is stored in the transfer memory for use when the image block to be processed adjacent to the current image block to be processed is filtered, so that the adjacent image block to be processed only needs to read, from the memory 50, the first image data of its second filtering area outside the overlapping area.
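  • the saving described above can be pictured with a small, purely illustrative sketch of a transfer memory; the dictionary-based cache, the counters and the region names below are assumptions made for illustration, not the terminal's actual memory interface.

```python
# Illustrative only: the overlap ("second image data") is read from DRAM once and
# kept in a transfer memory, so the adjacent block does not read it again.

transfer_memory = {}   # stand-in for the staging area in the internal memory 60
dram_reads = 0

def read_from_dram(region):
    global dram_reads
    dram_reads += 1
    return f"data({region})"

def load_filter_area(non_overlap_region, overlap_region):
    first = read_from_dram(non_overlap_region)            # first image data
    second = transfer_memory.get(overlap_region)
    if second is None:                                    # not cached yet
        second = read_from_dram(overlap_region)           # second image data
        transfer_memory[overlap_region] = second          # keep it for the neighbour
    return first, second

load_filter_area("A1_outside_overlap", "A1_A2_overlap")
load_filter_area("A2_outside_overlap", "A1_A2_overlap")
print(dram_reads)  # 3 reads instead of 4: 2*first image data + second image data
```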
  • it should be noted that the image data that exists in the filtering area can be obtained directly (such as the part where the filtering area overlaps with the image to be processed), while for the part of the filtering area where there is no image data (the part where the filtering area does not overlap with the image to be processed), padding needs to be performed according to the existing image data in the filtering area, so as to obtain the image data of each pixel in the filtering area.
  • the image processing method, the image processing device 10 and the terminal 100 of the embodiments of the present application first determine, according to the filtering radius of the current filtering layer, the filtering area of each pre-divided image block to be processed in the image to be processed; the filtering area is larger than the image block to be processed, so there is an overlapping area between the filtering areas of adjacent image blocks to be processed. Therefore, when the current image block to be processed is filtered, the image data of the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of the adjacent image block to be processed (i.e. the second image data) is acquired and can be reused by the corresponding adjacent image block to be processed. Compared with each image block to be processed reading the image data of its entire corresponding processing area on its own, the amount of data read for the filtering processing is reduced and the filtering efficiency is improved.
  • the filter layer includes multiple layers, and the image to be processed is filtered through the multi-layer filter layer successively, and the image processing method also includes:
  • 014 Determine the filtering radius of the current filtering layer according to the preset sub-filtering radius of the current filtering layer and the preset sub-filtering radius of the filtering layer after the current filtering layer.
  • the image processing device 10 further includes a second determination module 14 .
  • the second determining module 14 is used to execute step 014 . That is, the second determining module 14 is configured to determine the filtering radius of the current filtering layer according to the preset sub-filtering radius of the current filtering layer and the preset sub-filtering radius of the filtering layer after the current filtering layer.
  • the processor 30 is further configured to determine the filtering radius of the current filtering layer according to the preset sub-filtering radius of the current filtering layer and the preset sub-filtering radii of the filtering layers after the current filtering layer. That is to say, step 014 may be implemented by the processor 30.
  • the filter layer includes multiple layers that sequentially process the image to be processed, such as 1, 2, 3, 4, 5, and so on.
  • taking the filtering layer including two layers as an example, after the image data of the filtering area of the image block to be processed is processed successively by the first layer and the second layer, the filtering can be completed to generate a filtered image.
  • the processor 30 can determine the filtering radius of the current filtering layer based on the preset sub-filtering radius of the current filtering layer and the sub-filtering radii of the filtering layers after the current filtering layer.
  • the filtering layer includes 3 layers, the sub-filtering radius of the first layer is r1, the sub-filtering radius of the second layer is r2, and the sub-filtering radius of the third layer is r3.
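  • one natural reading of this rule (an assumption for illustration; the text above only states the dependency) is that each layer's filtering radius is its own sub-filtering radius plus the sub-filtering radii of all later layers, so that enough border pixels remain valid for every subsequent pass:

```python
# Illustrative reading: filter radius of layer i = r_i + r_(i+1) + ... + r_n.

def layer_filter_radii(sub_radii):
    """sub_radii lists the per-layer sub-filtering radii in processing order."""
    return [sum(sub_radii[i:]) for i in range(len(sub_radii))]

print(layer_filter_radii([1, 2, 3]))  # three layers r1=1, r2=2, r3=3 -> [6, 5, 3]
```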
  • the image processing method also includes the following steps:
  • 015 Determine the output area of the filtering area of the current filtering layer according to the sub-filtering radius of the filtering layer after the current filtering layer;
  • 016 Determine the calculation area of the current image block to be processed according to the position information and output area of the image block to be processed currently, and the output area of the image block to be processed adjacent to the image block to be processed which has completed filtering processing;
  • 017 Re-determine the first filtering area of the current image block to be processed according to the sub-filtering radius and calculation area of the current filtering layer; and
  • 0131 Obtain the image data corresponding to the re-determined first filtering area in the set composed of the first image data and the second image data.
  • the image processing device 10 further includes a third determining module 15 , a fourth determining module 16 and a fifth determining module 17 .
  • the third determination module 15 , the fourth determination module 16 , the fifth determination module 17 and the second acquisition module 13 are respectively used to execute step 015 , step 016 , step 017 and step 0131 .
  • that is, the third determination module 15 is used to determine the output area of the filtering area of the current filtering layer according to the sub-filtering radii of the filtering layers after the current filtering layer; the fourth determination module 16 is used to determine the calculation area of the current image block to be processed according to the position information and output area of the current image block to be processed and the output area of an adjacent image block to be processed that has completed filtering processing; the fifth determination module 17 is used to re-determine the first filtering area of the current image block to be processed according to the sub-filtering radius of the current filtering layer and the calculation area; and the second acquisition module 13 is used to acquire, from the set composed of the first image data and the second image data, the image data corresponding to the re-determined first filtering area.
  • the processor 30 is further configured to: determine the output area of the filtering area of the current filtering layer according to the sub-filtering radii of the filtering layers after the current filtering layer; determine the calculation area of the current image block to be processed according to the position information and output area of the current image block to be processed and the output area of an adjacent image block to be processed that has completed filtering processing; re-determine the first filtering area of the current image block to be processed according to the sub-filtering radius of the current filtering layer and the calculation area; and obtain, from the set composed of the first image data and the second image data, the image data corresponding to the re-determined first filtering area. That is to say, step 015, step 016, step 017 and step 0131 can be implemented by the processor 30.
  • the filtering radius of the current filtering layer is determined according to the sub-filtering radius of the current filtering layer and the sub-filtering radius of the filtering layer after the current filtering layer.
  • for example, the sub-filtering radius of the current filtering layer is R1, and the sum of the sub-filtering radii of all filtering layers after the current filtering layer is R2.
  • the dotted line frame represents the filtering area S1
  • the dotted line frame represents the output area S2.
  • the filtering frame during filtering processing is generally determined according to the sub-filtering radius of the current layer (such as R1), which is generally (2R1+1)*(2R1+1). Therefore, the image data used by each pixel in the overlapped part when filtering is located in the overlapped area of the filtering areas of the two image blocks to be processed. Therefore, when the two image blocks to be processed are respectively subjected to filter processing, the image data in the filter frame when each pixel of the overlapped portion is subjected to filter processing is the same, and the two image blocks to be processed are respectively filtered. , the filtered data in the overlapped portion is also the same, therefore, the filtered data in the overlapped portion can be multiplexed by the two image blocks to be processed.
  • after one of the two image blocks to be processed is filtered with the sub-filtering radius R1 of the current layer, the filtered data of the overlapping part can be output; therefore, when the other of the two image blocks to be processed is filtered, the overlapping part does not need to be filtered again, and only the area of its output area outside the overlapping part needs to be calculated.
  • the processor 30 can determine the calculation area of the current image block to be processed according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the current image block to be processed.
  • the image blocks to be processed adjacent to the image block to be processed need to have been filtered, so as to ensure that the current image block to be processed can reuse the filtered data of the overlapped part as the final output filter data of the current image block to be processed.
  • the overlapping portion C2 has a width of 2*R2 relative to the output area S2 corresponding to the image block A1 to be processed on the right.
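  • the relation between the output areas and the 2*R2 overlap can be illustrated with the following sketch; the assumption that the output area is the block expanded by R2 on each side follows the description above but is made explicit here only for illustration.

```python
# Illustrative only: output areas of two horizontally adjacent 8x8 blocks and the
# width of the overlap between them, with R1 the current layer's sub-filter radius
# and R2 the summed sub-filter radii of the later layers.

def output_area(block, r2):
    x0, y0, x1, y1 = block
    return (x0 - r2, y0 - r2, x1 + r2, y1 + r2)   # later layers still need R2 border

R1, R2 = 2, 3
left_out = output_area((0, 0, 8, 8), R2)
right_out = output_area((8, 0, 16, 8), R2)

print((2 * R1 + 1) ** 2)            # filter window size per pixel: (2*R1+1)^2 = 25
print(left_out[2] - right_out[0])   # width of the overlapping portion: 2*R2 = 6
```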
  • it can be understood that when the position of the image block A1 to be processed in the image to be processed is different, the position and number of the adjacent image blocks to be processed that have not been filtered are also different, but the principle of determining the calculation area is basically the same and will not be repeated here.
  • the processing may be performed line by line from left to right according to the lines formed by the image blocks to be processed.
  • the image to be processed is divided into 9 image blocks to be processed, which are arranged in 3*3.
  • the image block to be processed in the upper left corner can be processed first, and then the image blocks to be processed are processed one by one to the right until the first row is processed, after which the second row is processed starting from the leftmost image block of the second row.
  • the processing may be performed column by column from top to bottom according to the columns of the image blocks to be processed, and no limitation is set here.
  • in the embodiments of the present application, the description takes processing the rows composed of the image blocks to be processed one by one, from left to right, as an example.
  • the processor 30 may acquire the location information of the current image block to be processed to further determine the image block to be processed adjacent to the current image block to be processed which has undergone filtering processing, so as to realize the multiplexing of the filtered data of overlapping parts.
  • the position information includes at least one of the upper edge, the left edge and the middle position of the image to be processed.
  • for example, the position information of the current image block to be processed includes the upper edge and the left edge, that is, the current image block to be processed is located at both the upper edge and the left edge;
  • or the position information of the current image block to be processed only includes the upper edge, that is, the current image block to be processed is located at the upper edge but not at the left edge; or the position information only includes the left edge, that is, the current image block to be processed is located at the left edge but not at the upper edge;
  • or the position information of the current image block to be processed only includes the middle position, that is, the current image block to be processed is located neither at the upper edge nor at the left edge.
  • the position information when determining the calculation area, if the position information includes the upper edge, it means that there are adjacent image blocks to be processed on the left, right and lower sides of the current image block to be processed. Since the order of the filtering processing of the image blocks to be processed is row-by-row processing from left to right, only the image blocks to be processed on the left side of the current image block to be processed have completed the filtering processing.
  • the processor 30 can determine the calculation area according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the left side of the current image block to be processed, for example, the output area of the current image block to be processed In , the area outside the output area of the image block to be processed adjacent to the left of the current image block to be processed is used as the calculation area.
  • when the position information includes the left edge, only the image block to be processed above the current image block to be processed has completed the filtering processing, so the processor 30 can determine the calculation area according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the top of the current image block to be processed, for example, taking the area of the output area of the current image block to be processed outside the output area of the image block to be processed adjacent to the upper side of the current image block to be processed as the calculation area.
  • when the position information includes both the upper edge and the left edge (that is, the current image block to be processed is located at the top left corner), none of the adjacent image blocks to be processed has completed the filtering processing, and the processor 30 directly uses the output area of the current image block to be processed as the calculation area.
  • the position information includes the middle position, it means that there are adjacent image blocks to be processed on the upper side, left side, right side and lower side of the current image block to be processed. Since the order of the filtering processing of the image blocks to be processed is row-by-row processing from left to right, the image blocks to be processed above and to the left of the current image block to be processed have completed the filtering processing.
  • the processor 30 can determine the calculation area according to the output area of the current image block to be processed and the output areas of the two image blocks to be processed adjacent to the upper side and the left side of the current image block to be processed, for example, the current image block to be processed In the output area of the image block, the area outside the output area of the image block to be processed adjacent to the upper side of the current image block to be processed and the output area of the image block to be processed adjacent to the left is used as the calculation area.
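  • under the row-by-row, left-to-right processing order described above, the already-filtered neighbours whose output areas can be excluded from the calculation area follow a simple case analysis; the sketch below is only an illustration of that rule, with hypothetical names.

```python
# Illustrative only: which neighbours of the current block are already filtered,
# assuming blocks are processed row by row, from left to right.

def filtered_neighbours(on_top_edge, on_left_edge):
    if on_top_edge and on_left_edge:   # top-left corner: nothing processed yet
        return []
    if on_top_edge:                    # first row: only the left neighbour is done
        return ["left"]
    if on_left_edge:                   # first column: only the top neighbour is done
        return ["top"]
    return ["top", "left"]             # middle position: both are done

print(filtered_neighbours(True, True))    # []
print(filtered_neighbours(True, False))   # ['left']
print(filtered_neighbours(False, True))   # ['top']
print(filtered_neighbours(False, False))  # ['top', 'left']
```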
  • after the calculation area is determined, the first filtering area of the current image block to be processed can be re-determined, and after the re-determined filtering area is filtered, the filtered data of the calculation area can be output.
  • in this way, the calculation area of an adjacent image block to be processed that has not been filtered is determined with reference to the output area of the image block to be processed that has completed filtering, so that the filtered data of the overlapping part of the output areas can be reused by the adjacent, not-yet-filtered image block to be processed; only the filtered data of the calculation area needs to be computed, thereby reducing the calculation amount of the filtering processing and improving the filtering efficiency.
  • the filtering radius includes the fifth filtering radius and the sixth filtering radius
  • the image processing method also includes the following steps:
  • 018 Perform filtering processing on the image data corresponding to the re-determined first filtering area to obtain first filtering data;
  • 019 Obtain, according to the position information of the current image block to be processed, the second filtering data of the part of the output area of an adjacent image block to be processed that has completed filtering processing which overlaps with the output area of the current image block to be processed; and
  • 020 Output the filtering data of the output area of the current image block to be processed according to the first filtering data and the second filtering data.
  • the image processing device 10 further includes a filter module 18 , a third acquisition module 19 and an output module 20 .
  • the filtering module 18, the third acquisition module 19 and the output module 20 are respectively used to execute step 018, step 019 and step 020. That is, the filtering module 18 is used to filter the image data corresponding to the re-determined first filtering area to obtain the first filtering data; the third acquisition module 19 is used to obtain, according to the position information of the current image block to be processed, the second filtering data of the part of the output area of an adjacent image block to be processed that has completed filtering processing which overlaps with the output area of the current image block to be processed; and the output module 20 is used to output, according to the first filtering data and the second filtering data, the filtering data of the output area of the current image block to be processed.
  • the processor 30 is further configured to: filter the image data corresponding to the re-determined first filtering area to obtain the first filtering data; obtain, according to the position information of the current image block to be processed, the second filtering data of the part of the output area of an adjacent image block to be processed that has completed filtering processing which overlaps with the output area of the current image block to be processed; and output, according to the first filtering data and the second filtering data, the filtering data of the output area of the current image block to be processed. That is to say, step 018, step 019 and step 020 may be implemented by the processor 30.
  • the output area includes the calculation area and the part of the output area of the current image block to be processed that overlaps with the output area of the adjacent image block to be processed that has already been filtered.
  • after the processor 30 performs filtering processing on the re-determined filtering area, it can obtain the first filtering data of the calculation area; the processor 30 then directly obtains the second filtering data of the overlapping part from the adjacent image block to be processed that has already been filtered, so that the filtering data of the output area of the current image block to be processed can be obtained, where the overlapping part refers to the part of the output area of the adjacent, already-filtered image block to be processed that overlaps with the output area of the current image block to be processed.
  • the processor 30 outputs the first filtering data and the second filtering data, that is, the filtering data of the output area of the current image block to be processed.
  • after the current image block to be processed completes the filtering processing and the filtering data of the output area of the current image block to be processed is output, the image block to be processed that is adjacent to the current image block to be processed and has not been filtered (hereinafter referred to as the target image block) can be determined according to the position information of the current image block to be processed.
  • the location information may include at least one of the upper edge, the left edge, and the middle position. Since the filter processing sequence of the image block to be processed is preset (for example, the image block to be processed is filtered row by row from top to bottom, and each When there is a row of image blocks to be processed, filter processing is performed one by one from left to right), therefore, according to the position information, it can quickly determine the image blocks to be processed that have undergone filtering processing among one or more image blocks to be processed adjacent to the current image block to be processed image blocks, and image blocks to be processed without filtering.
  • when the position information includes the middle position, it can be determined that the two image blocks to be processed adjacent to the upper side and the left side of the current image block to be processed have been filtered, and the two image blocks to be processed adjacent to the lower side and the right side of the current image block to be processed have not been filtered.
  • the filtering data of the part of the output area of the current image block to be processed that overlaps with the output area of the target image block is stored in the transfer memory provided in the internal memory 60, so that when the target image block is filtered, this data and the filtering data of the calculation area of the target image block are used together as the filtering data of the output area of the target image block, thereby realizing the reuse of the output data.
  • the image processing method also includes:
  • 021 Re-determine the filter radius according to the preset down-sampling parameters, and perform filter processing on the image block to be processed.
  • the image processing device 10 further includes a sixth determination module 21 .
  • the sixth determination module 21 is used to execute step 021 . That is, the sixth determination module 21 is configured to re-determine the filter radius according to the preset down-sampling parameters, and perform filter processing on the image block to be processed.
  • the processor 30 is further configured to re-determine the filtering radius according to preset down-sampling parameters, and perform filtering processing on the image block to be processed. That is to say, step 021 may be implemented by the processor 30 .
  • the processor 30 can perform filtering processing according to the preset down-sampling parameter; for example, the pixels of the image block to be processed can be processed one by one at intervals of a predetermined number of pixels.
  • the downsampling parameter is 1/2, it means that when performing filtering processing, the pixels of the image block to be processed can be processed one by one at intervals of 1, so that the number of pixels undergoing filtering processing is reduced to 1/2 of the original, reducing the amount of filtering processing.
  • after the filtering is completed, the filtered image block to be processed can be up-sampled, so that the pixels that have not been filtered are interpolated according to the pixels that have been filtered in the filtered image block to be processed, to quickly obtain the pixel values of the pixels that have not been filtered, thereby reducing the amount of filtering processing while ensuring the filtering effect.
  • in order to ensure that each pixel can be normally filtered when the filtering processing is performed according to the down-sampling parameter, it is necessary to re-determine the filtering radius of the current filtering layer according to the down-sampling parameter. For example, when the down-sampling parameter is 1/2, the sub-filtering radius of the current filtering layer can be expanded by 2 times, and when the down-sampling parameter is 1/3, the sub-filtering radius of the current filtering layer can be expanded by 3 times, thereby re-determining the filtering radius of the current filtering layer. In this way, it can be ensured that when filtering is performed according to the down-sampling parameter, each pixel can be normally filtered.
  • when the size of the image to be processed is small and it is not convenient to perform filtering processing, the image to be processed can be up-sampled to improve the filtering effect, and the processor 30 can re-determine the filtering radius according to the preset up-sampling parameter and filter the image block to be processed.
  • for example, when the up-sampling parameter is 2, the sub-filtering radius of the current filtering layer can be reduced by the up-sampling parameter/2+1 (that is, 2), and when the up-sampling parameter is 4, the sub-filtering radius of the current filtering layer can be reduced by the up-sampling parameter/2+1 (that is, 3), thereby re-determining the filtering radius of the current filtering layer.
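  • the two radius adjustments above can be summarised in a short sketch; the down-sampling rule scales the sub-filtering radius by the inverse of the sampling factor and the up-sampling rule follows the "parameter/2+1" reduction stated above, but both functions are only illustrative readings of the description.

```python
# Illustrative only: re-determining the sub-filtering radius for down-/up-sampling.

def radius_for_downsampling(sub_radius, downsample):
    """downsample is a fraction, e.g. 1/2 keeps every other pixel."""
    return round(sub_radius / downsample)        # 1/2 -> x2, 1/3 -> x3

def radius_for_upsampling(sub_radius, upsample):
    return sub_radius - (upsample // 2 + 1)      # 2 -> reduce by 2, 4 -> reduce by 3

print(radius_for_downsampling(2, 1 / 2))  # 4
print(radius_for_downsampling(2, 1 / 3))  # 6
print(radius_for_upsampling(6, 2))        # 4
print(radius_for_upsampling(6, 4))        # 3
```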
  • a non-volatile computer-readable storage medium 300 storing a computer program 302 according to an embodiment of the present application.
  • the processors 30 can execute the image processing method of any one of the above embodiments.
  • the processors 30 are made to perform the following steps:
  • 011: determine, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed;
  • 012: obtain the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of the image block to be processed adjacent to the current image block to be processed; and
  • 013: obtain first image data and second image data, where the first image data includes the image data of the area of the filtering area of the image block to be processed outside the overlapping area, and the second image data includes the image data of the overlapping area of the image block to be processed; when the filtering processing is performed, the second image data is reused by the two adjacent image blocks to be processed that correspond to the same overlapping area.
  • the processors 30 may also perform the following steps:
  • 014 Determine the filtering radius of the current filtering layer according to the preset sub-filtering radius of the current filtering layer and the preset sub-filtering radius of the filtering layer after the current filtering layer.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

An image processing method, an image processing device (10), a terminal (100) and a storage medium (300). The method includes: (011) determining a filtering area according to a filtering radius; (012) obtaining the overlapping area between the first filtering area of a current image block to be processed and the second filtering area of an adjacent image block to be processed; and (013) obtaining first image data of the part of the first filtering area outside the overlapping area and second image data of the overlapping area, the second image data being used by the adjacent image block to be processed.

Description

Image processing method and apparatus, terminal, and readable storage medium
Priority Information
This application claims priority to and the benefit of Chinese Patent Application No. 2022101260246, filed with the China National Intellectual Property Administration on February 10, 2022, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of image technology, and in particular to an image processing method, an image processing device, a terminal, and a non-volatile computer-readable storage medium.
Background
At present, when an image is processed, the image is divided into blocks; when an image block is processed, the image data in the processing area corresponding to each image block is acquired.
Summary
Embodiments of the present application provide an image processing method, an image processing device, a terminal, and a non-volatile computer-readable storage medium.
The image processing method of the embodiments of the present application includes: determining, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; obtaining the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of an image block to be processed adjacent to the current image block to be processed; and obtaining first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used when the adjacent image block to be processed is filtered.
The image processing device of the embodiments of the present application includes a first determination module, a first acquisition module and a second acquisition module. The first determination module is configured to determine, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; the first acquisition module is configured to obtain the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of an image block to be processed adjacent to the current image block to be processed; and the second acquisition module is configured to obtain first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used when the adjacent image block to be processed is filtered.
The terminal of the embodiments of the present application includes a processor configured to: determine, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; obtain the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of an image block to be processed adjacent to the current image block to be processed; and obtain first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used when the adjacent image block to be processed is filtered.
A non-volatile computer-readable storage medium of the present application contains a computer program which, when executed by one or more processors, causes the processors to execute an image processing method. The image processing method includes: determining, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; obtaining the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of an image block to be processed adjacent to the current image block to be processed; and obtaining first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used when the adjacent image block to be processed is filtered.
Additional aspects and advantages of the present application will be given in part in the following description, will become apparent in part from the following description, or will be learned through practice of the present application.
Brief Description of the Drawings
In order to illustrate the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those of ordinary skill in the art based on these drawings without creative effort.
FIG. 1 is a schematic flowchart of an image processing method according to some embodiments of the present application;
FIG. 2 is a schematic block diagram of an image processing device according to some embodiments of the present application;
FIG. 3 is a schematic plan view of a terminal according to some embodiments of the present application;
FIGS. 4 to 7 are schematic diagrams of the principles of some embodiments of the present application;
FIG. 8 and FIG. 9 are schematic flowcharts of image processing methods according to some embodiments of the present application;
FIG. 10 is a schematic diagram of the principles of some embodiments of the present application;
FIG. 11 and FIG. 12 are schematic flowcharts of image processing methods according to some embodiments of the present application;
FIG. 13 is a schematic diagram of the connection between a processor and a computer-readable storage medium according to some embodiments of the present application.
Detailed Description
The embodiments of the present application are further described below with reference to the accompanying drawings. The same or similar reference numerals throughout the drawings denote the same or similar elements or elements having the same or similar functions. In addition, the embodiments of the present application described below with reference to the accompanying drawings are exemplary, are only used to explain the embodiments of the present application, and should not be construed as limiting the present application.
The image processing method of the present application includes: determining, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; obtaining the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of an image block to be processed adjacent to the current image block to be processed; and obtaining first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used when the adjacent image block to be processed is filtered.
In some embodiments, the filtering layer includes multiple layers, the image to be processed is filtered successively by the multiple filtering layers, and the image processing method further includes: determining the filtering radius of the current filtering layer according to the preset sub-filtering radius of the current filtering layer and the preset sub-filtering radii of the filtering layers after the current filtering layer.
In some embodiments, the image processing method further includes: determining the output area of the filtering area of the current filtering layer according to the sub-filtering radii of the filtering layers after the current filtering layer; determining the calculation area of the current image block to be processed according to the position information and output area of the current image block to be processed and the output area of an adjacent image block to be processed that has completed filtering processing; and re-determining the first filtering area according to the sub-filtering radius of the current filtering layer and the calculation area. Obtaining the first image data and the second image data includes: obtaining, from the set composed of the first image data and the second image data, the image data corresponding to the re-determined first filtering area.
In some embodiments, the position information includes at least one of a first edge, a second edge and a middle position of the image to be processed, and determining the calculation area of the current image block to be processed according to the position information and output area of the current image block to be processed and the output area of an adjacent image block to be processed that has completed filtering processing includes: when the position information includes the first edge, determining the calculation area according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the first edge of the current image block to be processed; when the position information includes the second edge, determining the calculation area according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the second edge of the current image block to be processed; when the position information includes the first edge and the second edge, determining the calculation area according to the output area of the current image block to be processed; and when the position information is the middle position, determining the calculation area according to the output area of the current image block to be processed and the output areas of the two image blocks to be processed respectively adjacent to the first edge and the second edge of the current image block to be processed.
In some embodiments, the image processing method further includes: performing filtering processing on the image data corresponding to the re-determined first filtering area to obtain first filtering data; obtaining, according to the position information of the current image block to be processed, the second filtering data of the part of the output area of an adjacent image block to be processed that has completed filtering processing which overlaps with the output area of the current image block to be processed; and outputting the filtering data of the output area of the current image block to be processed according to the first filtering data and the second filtering data.
In some embodiments, the filtering radius includes a first filtering radius, a second filtering radius, a third filtering radius and a fourth filtering radius; the first filtering radius, the second filtering radius, the third filtering radius and the fourth filtering radius are all the same; or the first filtering radius is the same as the third filtering radius and the second filtering radius is the same as the fourth filtering radius; or the first filtering radius, the second filtering radius, the third filtering radius and the fourth filtering radius are different from one another.
In some embodiments, the image processing method further includes storing the second image data in the internal memory after the filtering processing of the image block to be processed is completed.
In some embodiments, the image processing method further includes re-determining the filtering radius according to a preset down-sampling parameter, and performing filtering processing on the image block to be processed.
In some embodiments, the image processing method further includes: after the current image block to be processed completes filtering processing and the filtering data of the output area of the current image block to be processed is output, determining, according to the position information of the current image block to be processed, an image block to be processed that is adjacent to the current image block to be processed and has not been filtered.
The image processing device of the present application includes a first determination module, a first acquisition module and a second acquisition module. The first determination module is configured to determine, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; the first acquisition module is configured to obtain the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of an image block to be processed adjacent to the current image block to be processed; and the second acquisition module is configured to obtain first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used when the adjacent image block to be processed is filtered.
The terminal of the present application includes a processor configured to: determine, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; obtain the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of an image block to be processed adjacent to the current image block to be processed; and obtain first image data and second image data, where the first image data includes the image data of the area in the first filtering area outside the overlapping area, the second image data includes the image data of the overlapping area, and the second image data is used when the adjacent image block to be processed is filtered.
In some embodiments, the filtering layer includes multiple layers, the image to be processed is filtered successively by the multiple filtering layers, and the processor is further configured to determine the filtering radius of the current filtering layer according to the preset sub-filtering radius of the current filtering layer and the preset sub-filtering radii of the filtering layers after the current filtering layer.
In some embodiments, the processor is further configured to: determine the output area of the filtering area of the current filtering layer according to the sub-filtering radii of the filtering layers after the current filtering layer; determine the calculation area of the current image block to be processed according to the position information and output area of the current image block to be processed and the output area of an adjacent image block to be processed that has completed filtering processing; re-determine the first filtering area according to the sub-filtering radius of the current filtering layer and the calculation area; and obtain, from the set composed of the first image data and the second image data, the image data corresponding to the re-determined first filtering area.
In some embodiments, the position information includes at least one of a first edge, a second edge and a middle position of the image to be processed, and the processor is further configured to: when the position information includes the first edge, determine the calculation area according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the first edge of the current image block to be processed; when the position information includes the second edge, determine the calculation area according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the second edge of the current image block to be processed; when the position information includes the first edge and the second edge, determine the calculation area according to the output area of the current image block to be processed; and when the position information is the middle position, determine the calculation area according to the output area of the current image block to be processed and the output areas of the two image blocks to be processed respectively adjacent to the first edge and the second edge of the current image block to be processed.
In some embodiments, the processor is further configured to: perform filtering processing on the image data corresponding to the re-determined first filtering area to obtain first filtering data; obtain, according to the position information of the current image block to be processed, the second filtering data of the part of the output area of an adjacent image block to be processed that has completed filtering processing which overlaps with the output area of the current image block to be processed; and output the filtering data of the output area of the current image block to be processed according to the first filtering data and the second filtering data.
In some embodiments, the filtering radius includes a first filtering radius, a second filtering radius, a third filtering radius and a fourth filtering radius; the first filtering radius, the second filtering radius, the third filtering radius and the fourth filtering radius are all the same; or the first filtering radius is the same as the third filtering radius and the second filtering radius is the same as the fourth filtering radius; or the first filtering radius, the second filtering radius, the third filtering radius and the fourth filtering radius are different from one another.
In some embodiments, the processor is further configured to store the second image data in the internal memory after the filtering processing of the image block to be processed is completed.
In some embodiments, the processor is further configured to re-determine the filtering radius according to a preset down-sampling parameter, and perform filtering processing on the image block to be processed.
In some embodiments, the processor is further configured to, after the current image block to be processed completes filtering processing and the filtering data of the output area of the current image block to be processed is output, determine, according to the position information of the current image block to be processed, an image block to be processed that is adjacent to the current image block to be processed and has not been filtered.
The non-volatile computer-readable storage medium of the present application includes a computer program which, when executed by a processor, causes the processor to execute the image processing method of any one of the above embodiments.
Referring to FIG. 1 to FIG. 3, the image processing method of the embodiments of the present application includes the following steps:
011: determining, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed;
012: acquiring the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of the image block to be processed adjacent to the current image block to be processed; and
013: acquiring first image data and second image data, the first image data including the image data of the area, in the first filtering area, outside the overlapping area, the second image data including the image data of the overlapping area, and the second image data being used when the adjacent image block to be processed is filtered.
The image processing apparatus 10 of the embodiments of the present application includes a first determination module 11, a first acquisition module 12, and a second acquisition module 13, which are configured to perform step 011, step 012, and step 013, respectively. That is, the first determination module 11 is configured to determine, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; the first acquisition module 12 is configured to acquire the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of the adjacent image block to be processed; and the second acquisition module 13 is configured to acquire the first image data and the second image data, the first image data including the image data of the area, in the first filtering area, outside the overlapping area, the second image data including the image data of the overlapping area, and the second image data being used when the adjacent image block to be processed is filtered.
The terminal 100 of the embodiments of the present application includes a processor 30. The processor 30 is configured to determine, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed; acquire the overlapping area between the first filtering area of the current image block to be processed and the second filtering area of the adjacent image block to be processed; and acquire the first image data and the second image data, the first image data including the image data of the area, in the first filtering area, outside the overlapping area, the second image data including the image data of the overlapping area, and the second image data being used when the adjacent image block to be processed is filtered. In other words, step 011, step 012, and step 013 may be implemented by the processor 30.
Specifically, the terminal 100 further includes a housing 40. The terminal 100 may be a mobile phone, a tablet computer, a display device, a notebook computer, a teller machine, a gate machine, a smart watch, a head-mounted display device, a game console, or the like. As shown in FIG. 3, the embodiments of the present application are described by taking the terminal 100 being a mobile phone as an example; it can be understood that the specific form of the terminal 100 is not limited to a mobile phone. The housing 40 may also be used to mount functional modules of the terminal 100 such as a display device, an imaging device, a power supply device, and a communication device, so that the housing 40 provides protection such as dust-proofing, drop-proofing, and water-proofing for these functional modules.
The image to be processed may be an image captured by a camera 20 of the terminal 100, or an image downloaded from the Internet, which is not limited here. The image to be processed may also be a part of an image captured by the camera 20, or a depth image.
When the image to be processed is filtered, the filtering is generally performed based on a preset filtering algorithm. The preset filtering algorithm can determine a preset size of the image to be processed, a preset number of filtering layers, and the filtering radius of each filtering layer.
The image to be processed includes a first edge, a second edge, a third edge, and a fourth edge; the first edge is opposite to the third edge, and the second edge is opposite to the fourth edge. The first edge may be the upper edge and the second edge the left edge; or the first edge may be the lower edge and the second edge the right edge; or the first edge may be the upper edge and the second edge the right edge; and so on. In the following description, the first edge is the upper edge and the second edge is the left edge.
During filtering, if the size of the image to be processed is larger than the preset size, the image to be processed can be divided into multiple image blocks to be processed according to the preset size, so that the processor 30 can directly acquire the position information of each image block to be processed, such as the vertex coordinates of the image block (taking a rectangular image block as an example) and the position of the image block in the image to be processed (for example, the image block is located at the left edge, at the upper edge, or at both the left edge and the upper edge, i.e., the upper-left corner). Meanwhile, the processor 30 can also directly acquire the filtering radius of each filtering layer.
Then, according to the filtering radius of the current filtering layer, the processor 30 can determine the filtering area of each image block to be processed. For example, the filtering radius is the distance from the edge of the image block to be processed; when determining the filtering area of each image block, the edges of the image block are extended outward by the filtering radius, thereby determining the filtering area. Of course, in order to determine the position of the filtering area in the image to be processed, the filtering area can be determined according to the position information of the image block (such as its vertex coordinates) and the filtering radius.
The processor 30 can determine the vertex coordinates of the filtering area according to the vertex coordinates of the image block to be processed and the filtering radius. In one example, referring to FIG. 4, an image coordinate system is established with the upper-left corner of the image block A1 as the origin. The width of the image block A1 is 8 (the number of pixels along the W direction) and its height is 8 (the number of pixels along the H direction), so the vertex coordinates of the image block A1 are (0, 0), (8, 0), (0, 8), and (8, 8). If the filtering radius is 2 pixels, two rows of pixels are added at the upper edge and the lower edge of the image block A1 respectively, and two columns of pixels are added at the left edge and the right edge respectively, so the vertex coordinates of the filtering area S1 are (-2, -2), (10, -2), (-2, 10), and (10, 10). In this way, according to the position information of the image block A1 and the filtering radius of each filtering layer, the filtering area S1 of each filtering layer can be quickly determined.
In other embodiments, the filtering radius includes a first filtering radius (e.g., corresponding to the upper side of the image), a second filtering radius (e.g., corresponding to the right side), a third filtering radius (e.g., corresponding to the lower side), and a fourth filtering radius (e.g., corresponding to the left side). The first, second, third, and fourth filtering radii are all the same (as in the example shown in FIG. 4); or the first filtering radius is the same as the third filtering radius and the second filtering radius is the same as the fourth filtering radius; or the first, second, third, and fourth filtering radii are all different from one another.
In another example, referring to FIG. 5, an image coordinate system is established with the upper-left corner of the image block A1 as the origin, so the vertex coordinates of the image block A1 are (0, 0), (8, 0), (0, 8), and (8, 8). If the first filtering radius and the third filtering radius are both 2 pixels, and the second filtering radius and the fourth filtering radius are both 1 pixel, two rows of pixels are added at the upper edge and the lower edge of the image block A1 respectively, and one column of pixels is added at the left edge and the right edge respectively, so the vertex coordinates of the filtering area S1 are (-1, -2), (9, -2), (-1, 10), and (9, 10). In this way, according to the position information of the image block A1 and the filtering radius of each filtering layer, the filtering area S1 of each filtering layer can be quickly determined.
In still another example, referring to FIG. 6, an image coordinate system is established with the upper-left corner of the image block A1 as the origin, so the vertex coordinates of the image block A1 are (0, 0), (8, 0), (0, 8), and (8, 8). If the first, second, third, and fourth filtering radii are 4 pixels, 3 pixels, 2 pixels, and 1 pixel respectively, 4 rows and 2 rows of pixels are added at the upper edge and the lower edge of the image block A1 respectively, and 1 column and 3 columns of pixels are added at the left edge and the right edge respectively, so the vertex coordinates of the filtering area S1 are (-1, -4), (11, -4), (-1, 10), and (11, 10). In this way, according to the position information of the image block A1 and the filtering radius of each filtering layer, the filtering area S1 of each filtering layer can be quickly determined.
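By way of a purely illustrative sketch (Python; the function name and the upper/right/lower/left ordering of the per-side radii are assumptions, not part of the embodiments), the filtering-area vertices described in the above examples could be derived as follows:

    def filtering_area(x0, y0, width, height, r_top, r_right, r_bottom, r_left):
        # Extend each edge of the block outward by the radius of that side.
        left = x0 - r_left
        right = x0 + width + r_right
        top = y0 - r_top
        bottom = y0 + height + r_bottom
        return [(left, top), (right, top), (left, bottom), (right, bottom)]

    # Uniform radius of 2 pixels (example of FIG. 4):
    print(filtering_area(0, 0, 8, 8, 2, 2, 2, 2))   # [(-2, -2), (10, -2), (-2, 10), (10, 10)]
    # Radii of 4/3/2/1 pixels for the upper/right/lower/left sides (example of FIG. 6):
    print(filtering_area(0, 0, 8, 8, 4, 3, 2, 1))   # [(-1, -4), (11, -4), (-1, 10), (11, 10)]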
It can be understood that, when the filtering is performed, all the image data within the filtering area of each image block to be processed needs to be acquired. Since the filtering area is larger than the area where the image block itself is located, once the filtering area of each image block is determined, the overlapping area between the filtering areas of adjacent image blocks can be determined. Referring to FIG. 6, for adjacent image blocks A1, their filtering areas S1 have an overlapping area C1. It should be noted that the squares in FIG. 6 are only used to illustrate pixels, and one square is not limited to exactly one pixel.
In existing technical solutions, for the image data within the overlapping area (i.e., the second image data), when two adjacent image blocks are filtered, the processing area is generally larger than the area where the image block itself is located, so the amount of image data to be acquired for image processing is relatively large. For example, in addition to each of the two image blocks reading, from a memory 50 (such as a dynamic random access memory, DRAM), the image data of the area outside the overlapping area (i.e., the first image data), each of them also needs to read the second image data once from the memory 50 of the terminal into an internal memory 60 (such as tightly coupled memories (TCM) on the processing chip of the terminal 100, or the internal memory of a vision processing unit (VPU), etc.). That is, to filter these two image blocks, the second image data needs to be read from the memory 50 twice, and the total amount of data to be read is 2 * (first image data + second image data). In the present embodiments, after the second image data has been read in order to filter an image block, the second image data can be stored in the level-1 cache of the VPU, so that it can be used when the adjacent, not-yet-filtered image block is filtered; when the second image data is reused it can thus be read and written faster, which improves the filtering efficiency and reduces power consumption.
It should be noted that the first image data of different image blocks to be processed may be different; the amount of read data given here is only illustrative, and it is not required that the first image data of two adjacent image blocks be the same.
In the present application, after the overlapping area of two adjacent image blocks is determined and the first image data and the second image data of the current image block are acquired, the second image data is reused by the image block that is adjacent to the current image block and has not yet been filtered. That is, the second image data only needs to be read once from the memory 50 into the internal memory 60 in order to filter the two adjacent image blocks, and the total amount of data to be read is 2 * first image data + second image data. In other words, with the technical solution of the present application, every pair of adjacent image blocks saves one read of the second image data, which reduces the amount of data read during filtering and improves the filtering efficiency.
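A short illustrative calculation (Python; the byte counts are assumed values, not taken from the embodiments) makes the saving for one pair of adjacent blocks concrete:

    d1 = 8 * 12 * 2   # first image data per block, in bytes (assumed)
    d2 = 4 * 12 * 2   # second image data (overlapping area), in bytes (assumed)
    without_reuse = 2 * (d1 + d2)   # each block reads its whole filtering area
    with_reuse = 2 * d1 + d2        # the overlap is read once and reused
    print(without_reuse, with_reuse)   # 576 480 -> one read of d2 is saved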
When, in the current filtering layer, the filtering of one image block to be processed has been completed according to a preset filtering order (for example, the image blocks are filtered row by row from top to bottom, and within each row from left to right), the processor 30 may store the second image data of that image block in the internal memory 60, so that it can be used for filtering the image block that is adjacent to this image block and has not yet been filtered.
Specifically, a transfer memory may be set up separately in the internal memory 60 (a part of the storage space in the level-1 cache of the VPU is designated as the transfer memory). After the current image block to be processed has acquired the image data within its first filtering area, the second image data of the overlapping area in the first filtering area is stored in the transfer memory for the filtering of the image block adjacent to the current image block, so that the adjacent image block only needs to read, from the memory 50, the first image data of its second filtering area outside the overlapping area.
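A minimal sketch of this reuse mechanism (Python; the block identifiers, the dictionary standing in for the transfer memory, and the dummy read function are assumptions for illustration only):

    transfer_memory = {}   # stands in for the slice of the VPU level-1 cache

    def read_from_dram(block_id, region):
        # Dummy stand-in for a real DRAM read; returns a labelled placeholder.
        return f"{region} pixels of block {block_id}"

    def filter_block(block_id, right_neighbour_id=None):
        # Reuse the overlap pixels if the previously filtered neighbour cached them.
        overlap = transfer_memory.pop(block_id, None) or read_from_dram(block_id, "overlap")
        body = read_from_dram(block_id, "non-overlap")
        # ... the actual filtering of body + overlap is omitted in this sketch ...
        if right_neighbour_id is not None:
            # Park the strip shared with the next block so it is read only once.
            transfer_memory[right_neighbour_id] = overlap
        return body, overlap

    filter_block("A1", right_neighbour_id="A2")
    filter_block("A2")   # consumes the strip cached while filtering A1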
The image data that exists within the filtering area can be acquired directly (e.g., the part where the filtering area coincides with the image to be processed), while the part of the filtering area for which no image data exists (the part where the filtering area does not coincide with the image to be processed) needs to be filled based on the image data already present within the filtering area, so that the image data of every pixel within the filtering area is obtained.
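As a purely illustrative sketch of such filling (Python with NumPy assumed; edge replication is only one possible padding strategy and is not prescribed by the embodiments):

    import numpy as np

    block = np.arange(64, dtype=np.float32).reshape(8, 8)   # an 8 x 8 image block
    r = 2                                                    # filtering radius
    filled_area = np.pad(block, r, mode="edge")              # missing border pixels copied from the edge
    print(filled_area.shape)                                 # (12, 12)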
With the image processing method, the image processing apparatus 10, and the terminal 100 of the embodiments of the present application, the filtering area of each pre-divided image block to be processed in the image to be processed is first determined according to the filtering radius of the current filtering layer. Since the filtering area is larger than the image block, the filtering areas of adjacent image blocks overlap. Therefore, when the current image block is filtered, the image data of the area, in the first filtering area of the current image block, outside the overlapping area (i.e., the first image data) and the image data of the overlapping area of the current image block (i.e., the second image data) are acquired. Since the second image data is the image data of the overlapping area between the first filtering area and the second filtering area of the adjacent image block, the second image data can be reused by the corresponding adjacent image block. Compared with each image block acquiring the image data of its whole filtering area, in which case the second image data has to be read twice, reusing the second image data of the current image block means the second image data only needs to be read once, which reduces the amount of data to be read and written for filtering and improves the filtering efficiency.
Referring to FIG. 2, FIG. 3, and FIG. 7, in some embodiments, there are multiple filtering layers, and the image to be processed is filtered by the multiple filtering layers in sequence. The image processing method further includes:
014: determining the filtering radius of the current filtering layer according to the preset sub-filtering radius of the current filtering layer and the preset sub-filtering radii of the filtering layers after the current filtering layer.
In some embodiments, the image processing apparatus 10 further includes a second determination module 14, which is configured to perform step 014. That is, the second determination module 14 is configured to determine the filtering radius of the current filtering layer according to the preset sub-filtering radius of the current filtering layer and the preset sub-filtering radii of the filtering layers after the current filtering layer.
In some embodiments, the processor 30 is further configured to determine the filtering radius of the current filtering layer according to the preset sub-filtering radius of the current filtering layer and the preset sub-filtering radii of the filtering layers after the current filtering layer. In other words, step 014 may be implemented by the processor 30.
Specifically, the filtering layers include multiple layers that process the image to be processed in sequence, for example 1, 2, 3, 4, or 5 layers. Taking two filtering layers as an example, after the image data of the filtering area of an image block to be processed has been processed by the first layer and the second layer in sequence, the filtering is completed and a filtered image is generated.
Each filtering layer has a preset sub-filtering radius. When determining the filtering radius of the current filtering layer, the processor 30 can determine it according to the preset sub-filtering radius of the current filtering layer and the sub-filtering radii of the filtering layers after the current filtering layer.
For example, there are three filtering layers, with sub-filtering radii r1, r2, and r3 for the first, second, and third layers respectively. The processor 30 can take the sum of the preset sub-filtering radius of the current filtering layer and the sub-filtering radii of all the filtering layers after it as the filtering radius of the current filtering layer; that is, the filtering radius of the current (first) filtering layer = r1 + r2 + r3.
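A minimal sketch of this accumulation (Python; the sub-radius values are assumed for illustration):

    sub_radii = [1, 2, 1]   # r1, r2, r3 of three filtering layers (assumed values)
    # The filtering radius of layer i is its own sub-radius plus those of all later layers.
    radii = [sum(sub_radii[i:]) for i in range(len(sub_radii))]
    print(radii)            # [4, 3, 1]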
Referring to FIG. 2, FIG. 3, and FIG. 8, in some embodiments, the image processing method further includes the following steps:
015: determining the output area of the filtering area of the current filtering layer according to the sub-filtering radii of the filtering layers after the current filtering layer;
016: determining the calculation area of the current image block to be processed according to the position information and the output area of the current image block to be processed and the output area of the adjacent image block to be processed for which filtering has been completed;
017: re-determining the first filtering area of the current image block to be processed according to the sub-filtering radius of the current filtering layer and the calculation area;
Step 013 includes:
0131: acquiring, from the set composed of the first image data and the second image data, the image data corresponding to the re-determined first filtering area.
In some embodiments, the image processing apparatus 10 further includes a third determination module 15, a fourth determination module 16, and a fifth determination module 17. The third determination module 15, the fourth determination module 16, the fifth determination module 17, and the second acquisition module 13 are configured to perform step 015, step 016, step 017, and step 0131, respectively. That is, the third determination module 15 is configured to determine the output area of the filtering area of the current filtering layer according to the sub-filtering radii of the filtering layers after the current filtering layer; the fourth determination module 16 is configured to determine the calculation area of the current image block to be processed according to the position information and the output area of the current image block to be processed and the output area of the adjacent image block to be processed for which filtering has been completed; the fifth determination module 17 is configured to re-determine the first filtering area of the current image block to be processed according to the sub-filtering radius of the current filtering layer and the calculation area; and the second acquisition module 13 is configured to acquire, from the set composed of the first image data and the second image data, the image data corresponding to the re-determined first filtering area.
In some embodiments, the processor 30 is further configured to determine the output area of the filtering area of the current filtering layer according to the sub-filtering radii of the filtering layers after the current filtering layer; determine the calculation area of the current image block to be processed according to the position information and the output area of the current image block to be processed and the output area of the adjacent image block to be processed for which filtering has been completed; re-determine the first filtering area of the current image block to be processed according to the sub-filtering radius of the current filtering layer and the calculation area; and acquire, from the set composed of the first image data and the second image data, the image data corresponding to the re-determined first filtering area. In other words, step 015, step 016, step 017, and step 0131 may be implemented by the processor 30.
Specifically, the filtering radius of the current filtering layer is determined according to the sub-filtering radius of the current filtering layer and the sub-filtering radii of the filtering layers after it. Let the sub-filtering radius of the current filtering layer be R1 and the sum of the sub-filtering radii of all the filtering layers after it be R2. After the current filtering layer performs filtering, it outputs the filtering data of the area where the image block to be processed is located and the filtering data of the area corresponding to R2; that is, the output area of the filtering area can be determined according to the image block to be processed and R2.
It can be understood that, referring to FIG. 10, the dashed box indicates the filtering area S1 and the dash-dotted box indicates the output area S2. The output areas S2 of adjacent image blocks A1 have an overlapping portion C2. In the prior art, after two adjacent image blocks A1 are filtered, both of them output the filtering data of the overlapping portion C2, so the overlapping portion C2 has to be computed twice and output twice, which reduces the efficiency of the filtering process.
The filtering window used during filtering is generally determined by the sub-filtering radius of the current layer (e.g., R1), and is generally (2*R1+1) * (2*R1+1). Therefore, for every pixel of the overlapping portion, the image data used for its filtering lies entirely within the overlapping area of the filtering areas of the two image blocks. Consequently, when the two image blocks are filtered separately, the image data within the filtering window of each pixel of the overlapping portion is the same, and after the two image blocks have been filtered separately, the filtering data within the overlapping portion is also the same. Therefore, the filtering data within the overlapping portion can be reused by the two image blocks.
After one of the two image blocks has been filtered, the filtering data of the overlapping portion can already be output. Therefore, when the other of the two image blocks is filtered, the overlapping portion does not need to be filtered again; only the area, in the output area of that image block, outside the overlapping portion needs to be computed.
Therefore, according to the output area of the current image block to be processed and the output area of the adjacent image block to be processed, the processor 30 can determine the calculation area of the current image block. Here, the adjacent image block must already have been filtered, so that the current image block can reuse the filtering data of the overlapping portion as part of the filtering data finally output for the current image block.
For example, for an image block A1 with width TW and height TH, after the left image block A1 in FIG. 10 is filtered, the filtering data within the output area S2 corresponding to that image block is output; the width of the output area S2 is TW + 2*R2 and its height is TH + 2*R2, and the overlapping portion C2 has a width of 2*R2 and a height of TH + 2*R2. The overlapping portion C2 therefore has a width offset of 2*R2 relative to the output area S2 corresponding to the right image block A1, so when determining the calculation area S3 (the shaded part in FIG. 10) that actually needs to be computed within the output area S2 of the right image block A1, the calculation area S3 can be quickly determined according to this width offset of 2*R2.
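An illustrative sketch of this bookkeeping (Python; TW, TH, and R2 are assumed example values):

    TW, TH, R2 = 8, 8, 1

    out_w, out_h = TW + 2 * R2, TH + 2 * R2   # output area S2 of one block
    reused_w = 2 * R2                         # width of the strip shared with the left neighbour
    calc_w = out_w - reused_w                 # width of the calculation area S3 still to be filtered

    print(out_w, out_h, reused_w, calc_w)     # 10 10 2 8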
Of course, it can be understood that, when the image to be processed cannot be divided into an integer number of image blocks A1, there may be image blocks whose size is not TW * TH. In that case, the overlapping area C1 that can be reused for input and the overlapping portion C2 that can be reused for output can be determined accurately according to the actual width and height of the adjacent image blocks A1, so as to guarantee the correctness of the reuse.
In addition, depending on the position of an image block A1 in the image to be processed, the positions and number of its adjacent, not-yet-filtered image blocks A1 also differ, but the principle of determining the calculation area is basically the same and is not repeated here.
Furthermore, when all the image blocks of the image to be processed are filtered, they can be processed row by row, from left to right within each row composed of image blocks. For example, referring again to FIG. 7, the image to be processed is divided into 9 image blocks arranged in a 3*3 grid. During filtering, the image block in the upper-left corner can be processed first, then the image blocks to its right, until the first row is completed, and then the second row is processed starting from its leftmost image block. Of course, all the image blocks of the image to be processed can also be filtered column by column from top to bottom, in columns composed of image blocks, which is not limited here. In the present embodiments, the description takes as an example the case where all the image blocks of the image to be processed are filtered row by row, from left to right within each row.
Therefore, the processor 30 can acquire the position information of the current image block to be processed, to further determine the already-filtered image blocks adjacent to the current image block, so as to reuse the filtering data of the overlapping portion. For example, the position information includes at least one of the upper edge, the left edge, and the middle position of the image to be processed. When the current image block is in the upper-left corner, its position information includes the upper edge and the left edge, i.e., the current image block lies on both the upper edge and the left edge. When the position information only includes the upper edge, the current image block lies on the upper edge but not on the left edge. When the position information only includes the left edge, the current image block lies on the left edge but not on the upper edge. When the position information only includes the middle position, the current image block lies neither on the upper edge nor on the left edge.
Therefore, when determining the calculation area, if the position information includes the upper edge, there are adjacent image blocks on the left, right, and lower sides of the current image block. Since the filtering order of the image blocks is row by row from left to right, only the image block on the left side of the current image block has already been filtered. The processor 30 can therefore determine the calculation area according to the output area of the current image block and the output area of the image block adjacent to its left side, for example by taking, as the calculation area, the part of the output area of the current image block lying outside the output area of the image block adjacent to its left side.
If the position information includes the left edge, there are adjacent image blocks on the upper, right, and lower sides of the current image block. Since the filtering order of the image blocks is row by row from left to right, only the image block on the upper side of the current image block has already been filtered. The processor 30 can therefore determine the calculation area according to the output area of the current image block and the output area of the image block adjacent to its upper side, for example by taking, as the calculation area, the part of the output area of the current image block lying outside the output area of the image block adjacent to its upper side.
If the position information includes the left edge and the upper edge, the current image block lies in the upper-left corner of the image to be processed, so the current image block is the first image block to be filtered and there is no filtering data that can be reused for it. The processor 30 therefore takes the output area of the current image block directly as the calculation area.
If the position information includes the middle position, there are adjacent image blocks on the upper, left, right, and lower sides of the current image block. Since the filtering order of the image blocks is row by row from left to right, the image blocks on the upper and left sides of the current image block have already been filtered. The processor 30 can therefore determine the calculation area according to the output area of the current image block and the output areas of the two image blocks respectively adjacent to its upper and left sides, for example by taking, as the calculation area, the part of the output area of the current image block lying outside the output area of the image block adjacent to its upper side and the output area of the image block adjacent to its left side.
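A compact sketch of this case analysis (Python; the boolean flags and the returned labels are assumptions chosen purely for illustration):

    def reusable_neighbours(on_upper_edge, on_left_edge):
        # With row-by-row, left-to-right processing, only blocks above and to the
        # left of the current block can already have been filtered.
        if on_upper_edge and on_left_edge:
            return []                  # upper-left corner: nothing to reuse yet
        if on_upper_edge:
            return ["left"]            # only the left neighbour is done
        if on_left_edge:
            return ["upper"]           # only the upper neighbour is done
        return ["upper", "left"]       # middle position: both are done

    print(reusable_neighbours(True, True))     # []
    print(reusable_neighbours(False, False))   # ['upper', 'left']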
Then, after the calculation area has been determined, the filtering area of the current image block to be processed can be re-determined according to the sub-filtering radius of the current filtering layer and the calculation area, and after the re-determined filtering area has been filtered, the filtering data of the calculation area can be output.
In this way, the calculation area of an adjacent, not-yet-filtered image block is determined from the output area of an already-filtered image block, so that the filtering data of the overlapping portion in the output area of the already-filtered image block can be reused by the adjacent, not-yet-filtered image block; only the filtering data of the calculation area needs to be computed, which reduces the amount of computation in the filtering process and improves the filtering efficiency.
Referring to FIG. 2, FIG. 3, and FIG. 11, in some embodiments the filtering radius includes a fifth filtering radius and a sixth filtering radius, and the image processing method further includes the following steps:
018: filtering the image data corresponding to the re-determined first filtering area to acquire first filtering data;
019: acquiring, according to the position information of the current image block to be processed, second filtering data of the overlapping portion, in the output area of the image block to be processed adjacent to the current image block, that coincides with the output area of the current image block to be processed;
020: outputting the filtering data of the output area of the current image block to be processed according to the first filtering data and the second filtering data.
In some embodiments, the image processing apparatus 10 further includes a filtering module 18, a third acquisition module 19, and an output module 20, which are configured to perform step 018, step 019, and step 020, respectively. That is, the filtering module 18 is configured to filter the image data corresponding to the re-determined first filtering area to acquire the first filtering data; the third acquisition module 19 is configured to acquire, according to the position information of the current image block, the second filtering data of the overlapping portion, in the output area of the adjacent image block, that coincides with the output area of the current image block; and the output module 20 is configured to output the filtering data of the output area of the current image block according to the first filtering data and the second filtering data.
In some embodiments, the processor 30 is further configured to filter the image data corresponding to the re-determined first filtering area to acquire the first filtering data; acquire, according to the position information of the current image block, the second filtering data of the overlapping portion, in the output area of the adjacent image block, that coincides with the output area of the current image block; and output the filtering data of the output area of the current image block according to the first filtering data and the second filtering data. In other words, step 018, step 019, and step 020 may be implemented by the processor 30.
Specifically, after the current image block has been filtered, the filtering data of the output area of the current image block needs to be output, where the output area includes the calculation area and the overlapping portion between the output area of the current image block and the output area of the adjacent, already-filtered image block.
Therefore, after filtering the re-determined filtering area, the processor 30 obtains the first filtering data of the calculation area, and then directly acquires the second filtering data of the overlapping portion of the adjacent, already-filtered image block, thereby obtaining the filtering data of the output area of the current image block. Here, the overlapping portion of the adjacent, already-filtered image block refers to the part of that image block's output area that lies within the output area of the current image block.
Finally, the processor 30 outputs the first filtering data and the second filtering data, which together constitute the filtering data of the output area of the current image block to be processed.
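A purely illustrative sketch of stitching the two parts together (Python with NumPy assumed; the array sizes correspond to the TW = TH = 8, R2 = 1 example above and are not prescribed by the embodiments):

    import numpy as np

    reused_from_left = np.full((10, 2), 7.0)   # second filtering data: overlap strip, width 2*R2
    newly_computed = np.zeros((10, 8))         # first filtering data: calculation area of this block

    output_area = np.hstack([reused_from_left, newly_computed])
    print(output_area.shape)                   # (10, 10), i.e. (TH + 2*R2) x (TW + 2*R2)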
In other embodiments, after the current image block has been filtered and the filtering data of its output area has been output, the image block that is adjacent to the current image block but has not yet been filtered (hereinafter referred to as the target image block) can be determined according to the position information of the current image block.
For example, the position information may include at least one of the upper edge, the left edge, and the middle position. Since the filtering order of the image blocks is preset (for example, the image blocks are filtered row by row from top to bottom, and within each row from left to right), the already-filtered and the not-yet-filtered image blocks among the one or more image blocks adjacent to the current image block can be quickly determined from the position information. For example, if the position information includes the middle position, it can be determined that the two image blocks adjacent to the upper side and the left side of the current image block have already been filtered, while the two image blocks adjacent to the lower side and the right side have not yet been filtered.
The filtering data of the overlapping portion between the output area of the current image block and the output area of the target image block is then stored in the transfer memory set up in the internal memory 60, so that after the target image block has been filtered, this data is used, together with the filtering data of the calculation area of the target image block, as the filtering data of the output area of the target image block, thereby achieving reuse of the output data.
Referring to FIG. 2, FIG. 3, and FIG. 12, in some embodiments, the image processing method further includes:
021: re-determining the filtering radius according to a preset down-sampling parameter, and filtering the image block to be processed.
In some embodiments, the image processing apparatus 10 further includes a sixth determination module 21, which is configured to perform step 021. That is, the sixth determination module 21 is configured to re-determine the filtering radius according to the preset down-sampling parameter and filter the image block to be processed.
In some embodiments, the processor 30 is further configured to re-determine the filtering radius according to the preset down-sampling parameter and filter the image block to be processed. In other words, step 021 may be implemented by the processor 30.
Specifically, when the image size of the image block to be processed is relatively large, in order to further improve the filtering efficiency, the processor 30 can perform the filtering according to a preset down-sampling parameter, for example by processing the pixels of the image block one by one at intervals of a predetermined number of pixels. For example, a down-sampling parameter of 1/2 means that, during filtering, the pixels of the image block are processed one by one at an interval of 1 pixel, so that the number of pixels to be filtered is reduced to 1/2 of the original, which reduces the amount of filtering computation.
Then, after the filtering of the image block is completed, the filtered image block can be up-sampled, so that the pixels that were not filtered are interpolated from the pixels that were filtered in the filtered image block, and the pixel values of the unfiltered pixels are quickly obtained; the amount of filtering computation is reduced while the filtering effect is maintained.
In order to ensure that every pixel can be filtered normally when filtering according to the down-sampling parameter, the filtering radius of the current filtering layer needs to be re-determined according to the down-sampling parameter. For example, when the down-sampling parameter is 1/2, the sub-filtering radius of the current filtering layer can be enlarged by a factor of 2; when the down-sampling parameter is 1/3, the sub-filtering radius of the current filtering layer can be enlarged by a factor of 3, so as to re-determine the filtering radius of the current filtering layer. In this way, it can be ensured that every pixel is filtered normally when the filtering is performed according to the down-sampling parameter.
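A small illustrative sketch of this scaling rule (Python; the helper name and the use of fractions are assumptions for illustration):

    from fractions import Fraction

    def scaled_sub_radius(sub_radius, downsample):
        # A down-sampling parameter of 1/2 doubles the sub-radius, 1/3 triples it, and so on.
        return sub_radius * int(1 / Fraction(downsample))

    print(scaled_sub_radius(2, "1/2"))   # 4
    print(scaled_sub_radius(2, "1/3"))   # 6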
In other embodiments, when the image to be processed is small and not easy to filter, the image to be processed can also be up-sampled only, to improve the filtering effect; the processor 30 can re-determine the filtering radius according to a preset up-sampling parameter and filter the image block to be processed.
For example, an up-sampling parameter of 2 means that, during filtering, each pixel is sampled twice, so that the number of pixels to be filtered is increased to 2 times the original, thereby improving the filtering effect. In order to ensure that every pixel can be filtered normally when filtering according to the up-sampling parameter, the filtering radius of the current filtering layer needs to be re-determined according to the up-sampling parameter. For example, when the up-sampling parameter is 2, the sub-filtering radius of the current filtering layer can be reduced by up-sampling parameter / 2 + 1 (i.e., 2); when the up-sampling parameter is 4, the sub-filtering radius of the current filtering layer can be reduced by up-sampling parameter / 2 + 1 (i.e., 3), so as to re-determine the filtering radius of the current filtering layer. In this way, it can be ensured that every pixel is filtered normally when the filtering is performed according to the up-sampling parameter.
Referring to FIG. 13, a non-volatile computer-readable storage medium 300 of the embodiments of the present application stores a computer program 302. When the computer program 302 is executed by one or more processors 30, the processors 30 can execute the image processing method of any one of the above embodiments.
For example, with reference to FIG. 1, when the computer program 302 is executed by one or more processors 30, the processors 30 are caused to perform the following steps:
011: determining, according to the filtering radius of the current filtering layer, the filtering area of each image block to be processed in the image to be processed;
012: acquiring the overlapping area between the filtering areas of adjacent image blocks to be processed; and
013: acquiring first image data and second image data, the first image data including the image data of the area, in the filtering area of the image block to be processed, outside the overlapping area, the second image data including the image data of the overlapping area of the image block to be processed, wherein, when filtering is performed, the second image data is reused by the two adjacent image blocks to be processed that correspond to the same overlapping area.
For another example, with reference to FIG. 7, when the computer program 302 is executed by one or more processors 30, the processors 30 may also perform the following step:
014: determining the filtering radius of the current filtering layer according to the preset sub-filtering radius of the current filtering layer and the preset sub-filtering radii of the filtering layers after the current filtering layer.
In the description of this specification, descriptions with reference to the terms "one embodiment", "some embodiments", "exemplary embodiment", "example", "specific example", "some examples", and the like mean that specific features, structures, materials, or characteristics described in connection with the embodiment or example are included in at least one embodiment or example of the present application. In this specification, the schematic representations of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine the different embodiments or examples described in this specification, as well as the features of the different embodiments or examples, provided that they do not contradict each other.
Any process or method description in a flowchart or otherwise described herein may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing the steps of a specific logical function or process, and the scope of the preferred embodiments of the present application includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order according to the functions involved, which should be understood by those skilled in the art to which the embodiments of the present application belong.
Although the embodiments of the present application have been shown and described above, it can be understood that the above embodiments are exemplary and shall not be construed as limiting the present application; within the scope of the present application, a person of ordinary skill in the art may make changes, modifications, substitutions, and variations to the above embodiments.

Claims (20)

  1. An image processing method, comprising:
    determining, according to a filtering radius of a current filtering layer, a filtering area of each image block to be processed in an image to be processed;
    acquiring an overlapping area between a first filtering area of a current image block to be processed and a second filtering area of an image block to be processed adjacent to the current image block to be processed; and
    acquiring first image data and second image data, the first image data comprising image data of an area, in the first filtering area, outside the overlapping area, the second image data comprising image data of the overlapping area, and the second image data being used when the adjacent image block to be processed is filtered.
  2. The image processing method according to claim 1, wherein there are multiple filtering layers, and the image to be processed is filtered by the multiple filtering layers in sequence, the image processing method further comprising:
    determining the filtering radius of the current filtering layer according to a preset sub-filtering radius of the current filtering layer and preset sub-filtering radii of the filtering layers after the current filtering layer.
  3. The image processing method according to claim 2, further comprising:
    determining an output area of the filtering area of the current filtering layer according to the sub-filtering radii of the filtering layers after the current filtering layer;
    determining a calculation area of the current image block to be processed according to position information and the output area of the current image block to be processed, and the output area of an adjacent image block to be processed for which filtering has been completed;
    re-determining the first filtering area according to the sub-filtering radius of the current filtering layer and the calculation area;
    wherein acquiring the first image data and the second image data comprises:
    acquiring, from a set composed of the first image data and the second image data, the image data corresponding to the re-determined first filtering area.
  4. The image processing method according to claim 3, wherein the position information comprises at least one of a first edge, a second edge, and a middle position of the image to be processed, and determining the calculation area of the current image block to be processed according to the position information and the output area of the current image block to be processed and the output area of the adjacent image block to be processed for which filtering has been completed comprises:
    when the position information comprises the first edge, determining the calculation area according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the first edge of the current image block to be processed;
    when the position information comprises the second edge, determining the calculation area according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the second edge of the current image block to be processed;
    when the position information comprises the first edge and the second edge, determining the calculation area according to the output area of the current image block to be processed; and
    when the position information is the middle position, determining the calculation area according to the output area of the current image block to be processed and the output areas of the two image blocks to be processed respectively adjacent to the first edge and the second edge of the current image block to be processed.
  5. The image processing method according to claim 3, further comprising:
    filtering the image data corresponding to the re-determined first filtering area to acquire first filtering data;
    acquiring, according to the position information of the current image block to be processed, second filtering data of an overlapping portion, in the output area of the adjacent image block to be processed for which filtering has been completed, that coincides with the output area of the current image block to be processed; and
    outputting filtering data of the output area of the current image block to be processed according to the first filtering data and the second filtering data.
  6. The image processing method according to claim 1, wherein the filtering radius comprises a first filtering radius, a second filtering radius, a third filtering radius, and a fourth filtering radius, and the first, second, third, and fourth filtering radii are all the same; or the first filtering radius is the same as the third filtering radius, and the second filtering radius is the same as the fourth filtering radius; or the first, second, third, and fourth filtering radii are all different from one another.
  7. The image processing method according to claim 1, further comprising:
    storing the second image data into a memory after the filtering of the image block to be processed is completed.
  8. The image processing method according to claim 1, further comprising:
    re-determining the filtering radius according to a preset down-sampling parameter, and filtering the image block to be processed.
  9. The image processing method according to claim 5, further comprising:
    after the current image block to be processed has been filtered and the filtering data of the output area of the current image block to be processed has been output, determining, according to the position information of the current image block to be processed, an image block to be processed that is adjacent to the current image block to be processed and has not yet been filtered.
  10. An image processing apparatus, comprising:
    a first determination module, configured to determine, according to a filtering radius of a current filtering layer, a filtering area of each image block to be processed in an image to be processed;
    a first acquisition module, configured to acquire an overlapping area between a first filtering area of a current image block to be processed and a second filtering area of an image block to be processed adjacent to the current image block to be processed; and
    a second acquisition module, configured to acquire first image data and second image data, the first image data comprising image data of an area, in the first filtering area, outside the overlapping area, the second image data comprising image data of the overlapping area, and the second image data being used when the adjacent image block to be processed is filtered.
  11. A terminal, comprising a processor, wherein the processor is configured to determine, according to a filtering radius of a current filtering layer, a filtering area of each image block to be processed in an image to be processed; acquire an overlapping area between a first filtering area of a current image block to be processed and a second filtering area of an image block to be processed adjacent to the current image block to be processed; and acquire first image data and second image data, the first image data comprising image data of an area, in the first filtering area, outside the overlapping area, the second image data comprising image data of the overlapping area, and the second image data being used when the adjacent image block to be processed is filtered.
  12. The terminal according to claim 11, wherein there are multiple filtering layers, the image to be processed is filtered by the multiple filtering layers in sequence, and the processor is further configured to determine the filtering radius of the current filtering layer according to a preset sub-filtering radius of the current filtering layer and preset sub-filtering radii of the filtering layers after the current filtering layer.
  13. The terminal according to claim 12, wherein the processor is further configured to determine an output area of the filtering area of the current filtering layer according to the sub-filtering radii of the filtering layers after the current filtering layer; determine a calculation area of the current image block to be processed according to position information and the output area of the current image block to be processed, and the output area of an adjacent image block to be processed for which filtering has been completed; re-determine the first filtering area according to the sub-filtering radius of the current filtering layer and the calculation area; and acquire, from a set composed of the first image data and the second image data, the image data corresponding to the re-determined first filtering area.
  14. The terminal according to claim 13, wherein the position information comprises at least one of a first edge, a second edge, and a middle position of the image to be processed, and the processor is further configured to: when the position information comprises the first edge, determine the calculation area according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the first edge of the current image block to be processed; when the position information comprises the second edge, determine the calculation area according to the output area of the current image block to be processed and the output area of the image block to be processed adjacent to the second edge of the current image block to be processed; when the position information comprises the first edge and the second edge, determine the calculation area according to the output area of the current image block to be processed; and when the position information is the middle position, determine the calculation area according to the output area of the current image block to be processed and the output areas of the two image blocks to be processed respectively adjacent to the first edge and the second edge of the current image block to be processed.
  15. The terminal according to claim 13, wherein the processor is further configured to filter the image data corresponding to the re-determined first filtering area to acquire first filtering data; acquire, according to the position information of the current image block to be processed, second filtering data of an overlapping portion, in the output area of the adjacent image block to be processed for which filtering has been completed, that coincides with the output area of the current image block to be processed; and output filtering data of the output area of the current image block to be processed according to the first filtering data and the second filtering data.
  16. The terminal according to claim 11, wherein the filtering radius comprises a first filtering radius, a second filtering radius, a third filtering radius, and a fourth filtering radius, and the first, second, third, and fourth filtering radii are all the same; or the first filtering radius is the same as the third filtering radius, and the second filtering radius is the same as the fourth filtering radius; or the first, second, third, and fourth filtering radii are all different from one another.
  17. The terminal according to claim 11, wherein the processor is further configured to store the second image data into a memory after the filtering of the image block to be processed is completed.
  18. The terminal according to claim 11, wherein the processor is further configured to re-determine the filtering radius according to a preset down-sampling parameter and filter the image block to be processed.
  19. The terminal according to claim 15, wherein the processor is further configured to determine, after the current image block to be processed has been filtered and the filtering data of the output area of the current image block to be processed has been output, an image block to be processed that is adjacent to the current image block to be processed and has not yet been filtered, according to the position information of the current image block to be processed.
  20. A non-volatile computer-readable storage medium comprising a computer program, wherein, when the computer program is executed by a processor, the processor is caused to perform the image processing method according to any one of claims 1 to 9.
PCT/CN2022/139725 2022-02-10 2022-12-16 图像处理方法、装置、终端和可读存储介质 WO2023151385A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210126024.6A CN114519661A (zh) 2022-02-10 2022-02-10 图像处理方法、装置、终端和可读存储介质
CN202210126024.6 2022-02-10

Publications (1)

Publication Number Publication Date
WO2023151385A1 true WO2023151385A1 (zh) 2023-08-17

Family

ID=81596717

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/139725 WO2023151385A1 (zh) 2022-02-10 2022-12-16 图像处理方法、装置、终端和可读存储介质

Country Status (2)

Country Link
CN (1) CN114519661A (zh)
WO (1) WO2023151385A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114519661A (zh) * 2022-02-10 2022-05-20 Oppo广东移动通信有限公司 图像处理方法、装置、终端和可读存储介质

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102968769A (zh) * 2012-11-27 2013-03-13 西安交通大学 一种图像一致性增强装置
US20190089954A1 (en) * 2017-09-19 2019-03-21 Fujitsu Limited Information processing apparatus, information processing method, and information processing program
CN108282597A (zh) * 2018-01-05 2018-07-13 佛山市顺德区中山大学研究院 一种基于fpga的实时目标追踪系统和方法
CN109255771A (zh) * 2018-08-22 2019-01-22 广州兴森快捷电路科技有限公司 图像滤波方法及装置
CN110148087A (zh) * 2019-05-22 2019-08-20 杭州电子科技大学 基于稀疏表示的图像压缩及重建方法
CN114519661A (zh) * 2022-02-10 2022-05-20 Oppo广东移动通信有限公司 图像处理方法、装置、终端和可读存储介质

Also Published As

Publication number Publication date
CN114519661A (zh) 2022-05-20

Similar Documents

Publication Publication Date Title
US11823317B2 (en) Single pass rendering for head mounted displays
US10356385B2 (en) Method and device for stereo images processing
US9554048B2 (en) In-stream rolling shutter compensation
US9959600B2 (en) Motion image compensation method and device, display device
US10080007B2 (en) Hybrid tiling strategy for semi-global matching stereo hardware acceleration
CN105528758B (zh) 基于可编程逻辑器件的图像重映射方法及装置
WO2023151385A1 (zh) 图像处理方法、装置、终端和可读存储介质
EP2347385A1 (en) Method and system for image resizing based on interpolation enhanced seam operations
JP2011065560A (ja) 画像処理装置、及び画像処理方法
KR20150025594A (ko) 멀티 이미지 레이어 컴포지트 방법
CN107220930A (zh) 鱼眼图像处理方法、计算机装置及计算机可读存储介质
US11212435B2 (en) Semiconductor device for image distortion correction processing and image reduction processing
WO2023151386A1 (zh) 数据处理方法、装置、终端和可读存储介质
CN115035128A (zh) 基于fpga的图像重叠滑窗分割方法及系统
US20160284043A1 (en) Graphics processing
JPWO2013021525A1 (ja) 画像処理装置、画像処理方法、プログラム、及び集積回路
CN108280801A (zh) 基于双线性插值的重映射方法、装置和可编程逻辑器件
CN111914988A (zh) 神经网络设备、计算系统和处理特征图的方法
US20230093967A1 (en) Purple-fringe correction method and purple-fringe correction device
RU168781U1 (ru) Устройство обработки стереоизображений
US11974062B2 (en) Dynamic configuration of perspective transformation engine
US20220345592A1 (en) Image sensor module, image processing system, and operating method of image sensor module
US10565778B2 (en) Electronic devices for and methods of implementing memory transfers for image warping in an electronic device
JP2006338334A (ja) データ処理装置及びデータ処理方法
JP2005227479A (ja) 画像処理装置、画像処理方法及び画像処理方法をコンピュータに実行させるためのプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22925735

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE