CN113658085B - Image processing method and device


Info

Publication number
CN113658085B
CN113658085B (application CN202111218339.5A)
Authority
CN
China
Prior art keywords
image
frequency
processed
information density
area
Prior art date
Legal status
Active
Application number
CN202111218339.5A
Other languages
Chinese (zh)
Other versions
CN113658085A (en)
Inventor
李超超
王晔
李东朔
Current Assignee
Beijing Youmu Technology Co ltd
Original Assignee
Beijing Youmu Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Youmu Technology Co., Ltd.
Priority to CN202111218339.5A
Publication of CN113658085A
Application granted
Publication of CN113658085B
Priority to PCT/CN2022/083953 (published as WO2023065604A1)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/10 Image enhancement or restoration by non-spatial domain filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/10 Text processing
    • G06F40/166 Editing, e.g. inserting or deleting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection

Abstract

The present specification provides an image processing method and apparatus, wherein the image processing method includes: acquiring an image to be processed; converting the image to be processed into a gray image, and performing wavelet transformation on the gray image to obtain high-frequency information of the image to be processed; creating a high-frequency image corresponding to the image to be processed based on the high-frequency information; and processing the high-frequency image according to a preset image information density calculation strategy, and determining the image information density corresponding to each image area in the image to be processed according to the processing result.

Description

Image processing method and device
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and apparatus.
Background
With the development of internet technology, image processing is applied in more and more fields. In particular, in scenes that require embedding visual resources, such as live-broadcast, teaching, and commentary scenes, a visual resource associated with the explanation needs to be embedded into the displayed content to improve the viewing experience. In the prior art, an area of the display content is usually covered directly by a rectangular frame, and the visual resource is embedded in that frame for explanation; however, this approach does not consider the density of the information in the display content, so the information is easily occluded. How to accurately calculate the information density in the display content has therefore become a problem to be solved urgently.
Disclosure of Invention
In view of this, embodiments of the present specification provide an image processing method. The specification also relates to an image processing apparatus, a presentation method, a presentation apparatus, a computing device, and a computer-readable storage medium, so as to solve the prior-art problem that image information density is not taken into account.
According to a first aspect of embodiments herein, there is provided an image processing method including:
acquiring an image to be processed;
converting the image to be processed into a gray image, and performing wavelet transformation on the gray image to obtain high-frequency information of the image to be processed;
creating a high-frequency image corresponding to the image to be processed based on the high-frequency information;
and processing the high-frequency image according to a preset image information density calculation strategy, and determining the image information density corresponding to each image area in the image to be processed according to the processing result.
Optionally, the high frequency information comprises at least one of:
horizontal high frequency information, vertical high frequency information, diagonal high frequency information.
Optionally, the creating a high-frequency image corresponding to the image to be processed based on the high-frequency information includes:
calculating pixel information corresponding to each pixel point in the image to be processed based on the horizontal high-frequency information, the vertical high-frequency information and the diagonal high-frequency information;
and mapping the value of each pixel point in the image to be processed to a gray scale interval according to the pixel information, and creating the high-frequency image according to a mapping result.
Optionally, the processing the high-frequency image according to a preset image information density calculation strategy, and determining the image information density corresponding to each image area in the image to be processed according to the processing result includes:
analyzing the image information density calculation strategy, and determining a target sliding window and a target sliding step length according to an analysis result;
moving the target sliding window in the high-frequency image according to the target sliding step length, and calculating the image information density corresponding to each high-frequency image area in the high-frequency image according to the moving result;
and determining the image information density corresponding to each image area in the image to be processed based on the image information density corresponding to each high-frequency image area in the high-frequency image.
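The sliding-window pass described above can be sketched as follows. The window size, step, and the use of the mean window intensity as the density are illustrative assumptions; the text leaves these to the configurable calculation strategy.

```python
import numpy as np

def window_densities(high_freq_img, win=3, step=3):
    """Slide a win x win window across a high-frequency image (pixel values
    0-255) and return the mean-intensity "density" per window position."""
    h, w = high_freq_img.shape
    densities = {}
    for top in range(0, h - win + 1, step):
        for left in range(0, w - win + 1, step):
            patch = high_freq_img[top:top + win, left:left + win]
            densities[(top, left)] = float(patch.mean()) / 255.0
    return densities

# Toy 6x6 "high-frequency image": detail only in the top-left corner.
img = np.zeros((6, 6))
img[:3, :3] = 255.0
d = window_densities(img, win=3, step=3)
# The window at (0, 0) covers all of the detail; the ones covering no
# detail would be the preferred spots for embedding a visual resource.
best = min(d, key=d.get)
```

Mapping these window densities back to regions of the image to be processed then only requires the coordinate conversion between the two images.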
Optionally, the determining a target sliding window and a target sliding step according to the analysis result includes:
determining an initial sliding window and an initial sliding step length according to the analysis result, and determining the conversion relation between the image to be processed and the high-frequency image;
and adjusting the initial sliding window and the initial sliding step length based on the conversion relation to obtain the target sliding window and the target sliding step length.
Optionally, the image information density corresponding to any one high-frequency image region is determined as follows:
reading the pixel information density corresponding to each pixel point in a target image area of the image to be processed;
calculating the high-frequency pixel information density corresponding to each pixel point in the target high-frequency image area of the high-frequency image according to the pixel information density corresponding to each pixel point in the target image area;
and calculating the average value of the high-frequency pixel information density corresponding to each pixel point in the target high-frequency image area, and determining the image information density corresponding to the target high-frequency image area according to the calculation result.
Optionally, the determining, based on the image information density corresponding to each high-frequency image region in the high-frequency image, the image information density corresponding to each image region in the image to be processed includes:
establishing a mapping relation between each high-frequency image area in the high-frequency image and each image area in the image to be processed;
and determining the image information density corresponding to each image area in the image to be processed based on the mapping relation and the image information density corresponding to each high-frequency image area.
Optionally, after the step of determining the image information density corresponding to each image area in the image to be processed according to the processing result is executed, the method further includes:
selecting an image area corresponding to the minimum image information density as an added image area;
and reading the visual resource matched with the image to be processed, and adding the visual resource to the added image area.
Optionally, the adding the visualization resource to the added image area includes:
determining a transparentization level based on the image information density corresponding to the added image area;
and performing transparentization processing on the visual resource according to the transparentization level, and adding the transparentized visual resource to the added image area.
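The two steps above can be sketched as follows. The linear density-to-level mapping and the alpha values are hypothetical, since the text leaves the correspondence between density and transparentization level to the implementation.

```python
def transparency_level(density, levels=4):
    """Map an image information density in [0, 1] to a discrete
    transparentization level; the linear mapping is a hypothetical choice."""
    return min(int(density * levels), levels - 1)

def blend_pixel(background, overlay, alpha):
    """Alpha-blend one overlay pixel onto one background pixel."""
    return alpha * overlay + (1.0 - alpha) * background

level = transparency_level(0.6)        # denser area -> higher level
alpha = 1.0 - level / 4                # higher level -> lower opacity
out = blend_pixel(100.0, 200.0, alpha)
```

The design intent is that even the least dense area may still contain some information, so a denser added-image area yields a more transparent overlay.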
According to a second aspect of embodiments herein, there is provided an image processing apparatus comprising:
an acquisition module configured to acquire an image to be processed;
the conversion module is configured to convert the image to be processed into a gray image and perform wavelet transformation on the gray image to obtain high-frequency information of the image to be processed;
the creating module is configured to create a high-frequency image corresponding to the image to be processed based on the high-frequency information;
and the processing module is configured to process the high-frequency image according to a preset image information density calculation strategy, and determine the image information density corresponding to each image area in the image to be processed according to the processing result.
According to a third aspect of embodiments herein, there is provided a presentation method, comprising:
acquiring an image to be processed, and converting the image to be processed into a gray image;
performing wavelet transformation on the gray level image to obtain high-frequency information of the image to be processed, and creating a high-frequency image corresponding to the image to be processed based on the high-frequency information;
processing the high-frequency image according to a preset image information density calculation strategy, and determining the image information density corresponding to an initial image area in the image to be processed according to a processing result;
and screening a target image area in the initial image area based on the image information density, and adding visual resources to the target image area in the image to be processed for displaying.
Optionally, the initial image area is determined by:
determining a target service corresponding to the image to be processed;
loading an image area positioning strategy corresponding to the target service;
and determining the initial image area in the area to be processed according to the image area positioning strategy.
Optionally, the screening a target image region in the initial image region based on the image information density includes:
comparing the image information density corresponding to each initial image area;
and selecting the initial image area with the minimum image information density as the target image area according to the comparison result.
Optionally, the adding a visualization resource to the target image region in the image to be processed for displaying includes:
determining the information density of the target image corresponding to the target image area;
determining a target transparentization level corresponding to the target image information density based on the corresponding relation between the image information density and the transparentization level;
and performing transparentization processing on the visual resource according to the target transparentization level, and adding the transparentized visual resource to the target image area in the image to be processed for displaying.
According to a fourth aspect of embodiments herein, there is provided a display apparatus comprising:
the image acquisition module is configured to acquire an image to be processed and convert the image to be processed into a gray image;
the image creating module is configured to perform wavelet transformation on the grayscale image to obtain high-frequency information of the image to be processed, and create a high-frequency image corresponding to the image to be processed based on the high-frequency information;
the image processing module is configured to process the high-frequency image according to a preset image information density calculation strategy and determine the image information density corresponding to an initial image area in the image to be processed according to a processing result;
and the screening area module is configured to screen a target image area in the initial image area based on the image information density, and add visual resources to the target image area in the image to be processed for displaying.
According to a fifth aspect of embodiments herein, there is provided a computing device comprising:
a memory and a processor;
the memory is configured to store computer-executable instructions, and the processor is configured to implement the steps of the image processing method or the presentation method when executing the computer-executable instructions.
According to a sixth aspect of embodiments herein, there is provided a computer-readable storage medium storing computer-executable instructions that, when executed by a processor, implement the steps of the image processing method or the presentation method.
In the image processing method provided by this specification, after the image to be processed is obtained, in order to subsequently calculate the image information density of each area accurately, the image to be processed may first be converted into a grayscale image, and wavelet transformation is then performed on the grayscale image to obtain the high-frequency information of the image to be processed. A high-frequency image corresponding to the image to be processed is then created based on the high-frequency information, so that the image information density of each image area is obtained in the high-frequency dimension. This effectively guarantees the accuracy of the density calculation, facilitates the subsequent embedding of visual resources based on the image information density, and prevents the information in the image to be processed from being occluded.
Drawings
Fig. 1 is a flowchart of an image processing method provided in an embodiment of the present specification;
FIG. 2 is a diagram illustrating an image transformation process provided in an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of an image to be displayed according to an embodiment of the present disclosure;
FIG. 4 is a diagram of a layout of a visual image provided by an embodiment of the present description;
FIG. 5 is a schematic diagram of a visualization image provided by an embodiment of the present description;
FIG. 6 is a diagram of a high frequency image provided in an embodiment of the present disclosure;
FIG. 7 is a diagram illustrating a first image information density according to an embodiment of the present disclosure;
FIG. 8 is a diagram illustrating a second image information density according to an embodiment of the present disclosure;
FIG. 9 is a diagram of a first type of embedded visualization resource provided by an embodiment of the present specification;
fig. 10 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present specification;
FIG. 11 is a flow chart of a presentation method provided by an embodiment of the present disclosure;
FIG. 12 is a diagram illustrating a third density of image information provided in an embodiment of the present disclosure;
FIG. 13 is a diagram of a second type of embedded visualization resource provided by an embodiment of the present specification;
FIG. 14 is a schematic structural diagram of a display device according to an embodiment of the present disclosure;
fig. 15 is a block diagram of a computing device according to an embodiment of the present disclosure.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present specification. This specification may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; those skilled in the art can make similar generalizations without departing from its spirit and scope.
The terminology used in the description of the one or more embodiments is for the purpose of describing the particular embodiments only and is not intended to be limiting of the description of the one or more embodiments. As used in one or more embodiments of the present specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present specification refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It will be understood that, although the terms first, second, etc. may be used in one or more embodiments herein to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, without departing from the scope of one or more embodiments of the present specification, "first" may also be referred to as "second", and similarly, "second" may also be referred to as "first". The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining", depending on the context.
In the present specification, an image processing method is provided, and the present specification relates to an image processing apparatus, a presentation method, a presentation apparatus, a computing device, and a computer-readable storage medium, which are described in detail one by one in the following embodiments.
Fig. 1 is a flowchart illustrating an image processing method according to an embodiment of the present specification, which specifically includes the following steps:
step S102, acquiring an image to be processed.
Specifically, the image to be processed refers to an image whose information density needs to be calculated. Performing this calculation yields the image information density of each image area in the image to be processed, which makes it convenient to determine the areas with the maximum and the minimum image information density. Here, image information density refers to the area covered by information within an image region: the larger the covered area, the higher the image information density; conversely, the smaller the covered area, the lower the image information density. It should be noted that the size of an image area may be set according to actual requirements, which is not limited in this embodiment.
In practical applications, in a live-broadcast, teaching, or commentary scene, the image information density of each image area in a displayed video frame can be calculated, so that the area with the minimum density can be selected for embedding a visual resource, which may be an anchor's avatar picture, a teacher's lecture picture, a commentator's picture, and the like. This reduces the amount of information in the video frame that the visual resource covers, prevents information in the video frame from being occluded, and improves the user's viewing experience.
Based on this, the image to be processed may be a video frame image at a certain time in a live broadcast scene, may also be a certain page PPT in a lecture scene, and may also be a video frame image at a certain time in a commentary scene, which is not limited in this embodiment.
In addition, when the layout and the layout of the image to be processed are performed, for example, new elements are embedded in the image to be processed, in order to avoid mutual occlusion between the elements, the image information density of each image area may also be calculated by using the image processing method provided in this embodiment, so as to conveniently select the image area with the minimum image information density to add the new elements, and complete the layout and the layout of the image to be processed, and the specific description contents may refer to the corresponding description contents in this embodiment, which is not described in detail herein.
In this embodiment, a teaching scene is taken as an example for explanation, and accordingly, an image to be processed is a certain page of PPT that needs to be displayed to a user listening to a teaching in the teaching scene, and by calculating the information density of each image region in the page of PPT, an image region with the minimum image information density can be selected to place a lecture picture, so as to avoid shielding the contents of the PPT.
And step S104, converting the image to be processed into a gray image, and performing wavelet transformation on the gray image to obtain high-frequency information of the image to be processed.
Specifically, on the basis of obtaining the image to be processed, in order to accurately calculate the information density of each image area in the image to be processed, the image to be processed may be converted into a grayscale image, and then the grayscale image is subjected to wavelet transformation, so as to obtain high-frequency information of the image to be processed, so as to facilitate subsequent calculation of the image information density of each image area in a high-frequency dimension.
The gray level image specifically refers to an image which is converted from a three-channel (RGB) image to be processed into a single-channel image; correspondingly, the high-frequency information specifically refers to information corresponding to the image to be processed in the high-frequency dimension, which is obtained after wavelet transformation, and includes, but is not limited to, horizontal high-frequency information, vertical high-frequency information, and diagonal high-frequency information corresponding to the image to be processed. The horizontal high-frequency information is used for representing the details of the image to be processed in the horizontal dimension, the vertical high-frequency information is used for representing the details of the image to be processed in the vertical dimension, and the diagonal high-frequency information is used for representing the details of the image to be processed in the diagonal dimension.
That is to say, the image processing method provided by this embodiment reflects the fluctuation value of the local change of the image information in the image to be processed by extracting the high-frequency feature of the image to be processed, and further calculates the information distribution details of the image to be processed based on this, so that the image information density of each image area in the image to be processed can be accurately calculated.
Further, wavelet transformation of the image to be processed yields corresponding low-frequency information and high-frequency information: the low-frequency information corresponds to averages and reflects the slow variation of the image to be processed, while the high-frequency information corresponds to differences and reflects its fluctuation. The high-frequency information thus reflects the amount of information in each image area of the image to be processed, which facilitates the subsequent calculation of image information density.
Further, referring to the schematic diagram shown in fig. 2, when performing wavelet transformation on the grayscale image, in order to improve processing efficiency, the grayscale image may be input to a high-pass filter h_high and a low-pass filter h_low: the high-pass filter h_high generates the vertical high-frequency information V and the diagonal high-frequency information D corresponding to the image to be processed, and the low-pass filter h_low generates the horizontal high-frequency information H and the low-frequency information A corresponding to the image to be processed, which facilitates the subsequent calculation of image information density. The high-pass filter h_high allows high-frequency information to pass, and the low-pass filter h_low allows low-frequency information to pass.
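The row/column filtering described above can be sketched with an unnormalized single-level Haar-style transform, where the low-pass filter is a pairwise average and the high-pass filter is a pairwise difference. The subband naming below follows this text's convention; a production system would typically use a wavelet library such as PyWavelets instead.

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2D Haar-style decomposition of a grayscale image with
    even dimensions, returning (A, H, V, D). An unnormalized averaging
    variant is used: low-pass = pairwise mean, high-pass = pairwise
    difference, applied along rows and then columns."""
    lo = (img[:, ::2] + img[:, 1::2]) / 2.0   # row low-pass
    hi = (img[:, ::2] - img[:, 1::2]) / 2.0   # row high-pass
    A = (lo[::2, :] + lo[1::2, :]) / 2.0      # low-low: slow variation
    H = (lo[::2, :] - lo[1::2, :]) / 2.0      # horizontal detail
    V = (hi[::2, :] + hi[1::2, :]) / 2.0      # vertical detail
    D = (hi[::2, :] - hi[1::2, :]) / 2.0      # diagonal detail
    return A, H, V, D

# A vertical edge puts energy in the vertical-detail subband only.
img = np.zeros((4, 4))
img[:, 1:] = 8.0
A, H, V, D = haar_dwt2(img)
```

A flat region produces zeros in all three detail subbands, which is exactly why the detail subbands serve as an information-density signal.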
It should be noted that the purpose of the high-frequency information is to create the high-frequency image on which the image information density calculation is performed, and the creation of that image is driven by the high-frequency information {horizontal high-frequency information, vertical high-frequency information, diagonal high-frequency information}. After the transform there is a visualized image for each dimension (the low-frequency, horizontal, vertical, and diagonal dimensions). To facilitate the subsequent density calculation, the pixel values of each visualized image are mapped to the range of 0-255 to create grayscale visualized images, which makes it convenient to integrate the three high-frequency-dimension images into the high-frequency image of the image to be processed and thus to calculate the image information density.
For example, fig. 3 is a page of PPT that needs to be displayed to a user, in order to improve an explanation effect of the PPT, a service platform provides an explanation picture in which an explanation teacher is added during the PPT explanation process, and in order to avoid that a region corresponding to the explanation picture occludes content in the PPT, image information density of each image region in a current PPT page needs to be calculated at this time.
Based on this, the to-be-processed image corresponding to the PPT page is first converted into a grayscale image, and then the grayscale image is subjected to wavelet transform to obtain low-frequency information a, horizontal high-frequency information H, vertical high-frequency information V, and diagonal high-frequency information D corresponding to the to-be-processed image, and a visualized image layout corresponding to the information is shown in fig. 4. Wherein, the visual image corresponding to the low-frequency information a is shown in fig. 5 (a); a visualized image of the horizontal high-frequency information H is shown in fig. 5 (b); a visualized image of the vertical high-frequency information V is shown in fig. 5 (c); a visualized image of the diagonal high-frequency information D is shown in fig. 5 (D); and then, a visual high-frequency image corresponding to the image to be processed is created by conveniently combining the horizontal high-frequency information H of the horizontal dimension, the vertical high-frequency information V of the vertical dimension and the diagonal high-frequency information D of the diagonal dimension, so that the image information density can be calculated.
In conclusion, by converting the image to be processed into a grayscale image and acquiring its high-frequency information, preparation is made for subsequently calculating the image information density of each image area, and the calculation accuracy can be effectively improved, so that the distribution of image information density in the image to be processed can be analyzed more quickly.
And step S106, creating a high-frequency image corresponding to the image to be processed based on the high-frequency information.
Specifically, on the basis of obtaining the high-frequency information of the image to be processed, in order to accurately calculate the image information density of each image region, at this time, a high-frequency image corresponding to the image to be processed may be created in combination with the high-frequency information, so as to calculate the image information density of each image region in a high-frequency dimension, and then mapped into the image to be processed, so as to obtain the image information density of each image region in the image to be processed, thereby effectively improving the calculation accuracy and the calculation efficiency. It should be noted that the high-frequency image specifically refers to an image obtained by mapping an image to be processed to a high-frequency dimension, and a value of each pixel in the image is mapped in a range of 0 to 255, so as to be displayed in a gray-scale image manner. And subsequently, the image information density of each image area can be calculated by analyzing the values of the pixel points.
Further, when a high-frequency image corresponding to the image to be processed is created, because the high-frequency information of the image to be processed includes horizontal high-frequency information, vertical high-frequency information, and diagonal high-frequency information, the high-frequency image combining three-dimensional features can be created by combining the high-frequency information of three dimensions, so as to facilitate subsequent calculation of image information density, in this embodiment, the specific implementation manner is as follows:
calculating pixel information corresponding to each pixel point in the image to be processed based on the horizontal high-frequency information, the vertical high-frequency information and the diagonal high-frequency information;
and mapping the value of each pixel point in the image to be processed to a gray scale interval according to the pixel information, and creating the high-frequency image according to a mapping result.
Specifically, the pixel information specifically refers to a value of each pixel point in a high-frequency dimension calculated after combining horizontal high-frequency information, vertical high-frequency information and diagonal high-frequency information; correspondingly, the gray scale interval specifically refers to a value range of 0-255.
Based on this, after the horizontal, vertical, and diagonal high-frequency information of the image to be processed is obtained, the value of each pixel point in the high-frequency dimension can be calculated by combining the three. The value of each pixel point is then mapped into the range of 0-255 to obtain its mapping value in the high-frequency dimension, and the high-frequency image is created from these values based on the position of each pixel point. Because the high-frequency image combines the horizontal, vertical, and diagonal high-frequency information, the accuracy of the subsequent image information density calculation is effectively guaranteed.
In this process, when the pixel information of each pixel point in the image to be processed is calculated, the horizontal, vertical and diagonal high-frequency information corresponding to each pixel point can each be squared, the average of the three squares taken, and the square root of that average extracted to obtain the value of each pixel point. The value is then mapped to the range of 0-255 to obtain the mapping value of each pixel point in the high-frequency dimension, and finally the high-frequency image of the image to be processed is assembled from these mapping values.
Following the above example, after obtaining the high-frequency information { horizontal high-frequency information H, vertical high-frequency information V and diagonal high-frequency information D } of the image to be processed, the horizontal high-frequency information Hn, vertical high-frequency information Vn and diagonal high-frequency information Dn of each pixel point in the image to be processed can be taken, and the following formula applied:
Pn = √((Hn² + Vn² + Dn²) / 3)
The value of each pixel point in the high-frequency dimension can thus be calculated. Finally, the value of each pixel point is mapped to the range of 0-255 to obtain its mapping value in the high-frequency dimension, and the high-frequency image of the image to be processed is created from these mapping values, giving the high-frequency image shown in fig. 6. Because this high-frequency image combines the high-frequency information { horizontal high-frequency information H, vertical high-frequency information V and diagonal high-frequency information D } of the image to be processed, the image information density of each image area can be calculated on its basis.
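The steps described above (square each component, average, take the square root, then map to the 0-255 grayscale interval) can be sketched as follows. This is a minimal illustration assuming NumPy arrays of wavelet detail coefficients; the function name is illustrative, and normalizing by the maximum value is one possible 0-255 mapping, since the specification does not fix the normalization:

```python
import numpy as np

def build_high_frequency_image(H, V, D):
    """Combine horizontal (H), vertical (V) and diagonal (D) high-frequency
    coefficients into one grayscale high-frequency image.
    H, V, D are equally-sized 2-D arrays of wavelet detail coefficients."""
    # Root-mean-square of the three detail coefficients at each pixel point:
    # square, average the three squares, then take the square root.
    value = np.sqrt((H ** 2 + V ** 2 + D ** 2) / 3.0)
    # Map each value into the 0-255 grayscale interval (normalize to the
    # maximum value -- an assumed mapping, not fixed by the specification).
    vmax = value.max()
    if vmax > 0:
        value = value / vmax * 255.0
    return value.astype(np.uint8)
```

A pixel whose three detail coefficients are all zero maps to 0 (black, low information density), while the pixel with the strongest combined detail maps to 255 (white, high information density), matching the grayscale interpretation given later in the text.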
In conclusion, creating the high-frequency image by combining the high-frequency information ensures that the high-frequency image incorporates the high-frequency information of the image to be processed, that the subsequent calculation of the image information density can be completed on the basis of this grayscale image, and that the accuracy of the calculated image information density is effectively improved.
And S108, processing the high-frequency image according to a preset image information density calculation strategy, and determining the image information density corresponding to each image area in the image to be processed according to a processing result.
Specifically, after the high-frequency image is obtained, the high-frequency image corresponding to the image to be processed may be further processed according to a preset image information density calculation strategy, so as to achieve that the image information density of each image area is calculated in the high-frequency dimension, and then mapped into the image to be processed, so as to obtain the image information density of each image area in the image to be processed.
The image information density calculation strategy specifically refers to moving a preset sliding window in the high-frequency image by a preset step length. After each movement, the image information density of the area currently framed by the sliding window in the high-frequency image is calculated; after the sliding window has finished moving, the image information densities of the n window areas in the high-frequency image are obtained. These n densities are then mapped to the image to be processed, so as to obtain the image information density of each image area in the image to be processed.
Further, when calculating the image information density of each image region in the image to be processed, in order to ensure the accuracy of the calculation, the calculation of the image information density may be completed by performing high-frequency dimension processing and then mapping to the image to be processed, in this embodiment, a specific implementation manner is as follows:
step S1082, analyzing the image information density calculation strategy, and determining a target sliding window and a target sliding step length according to the analysis result;
specifically, the target sliding window specifically refers to a rectangular frame with a set size, and the image information density of each image area is obtained by mapping the image information density corresponding to the framed area of the rectangular frame in the high-frequency image; correspondingly, the target sliding step specifically refers to a pixel value of each movement of the target sliding window in the high-frequency image.
Further, the image information density is calculated in the high-frequency dimension and then mapped to the image to be processed, but the preset sliding window and preset sliding step are both set for the image to be processed and cannot be applied to the high-frequency image directly: the high-frequency image is a feature expression of the image to be processed in the high-frequency dimension, and its size is one half of that of the image to be processed. Therefore, after the sliding window and sliding step corresponding to the image to be processed are obtained, they need to be converted to obtain a target sliding window and a target sliding step applicable to the high-frequency image. In this embodiment, the specific implementation manner is as follows:
determining an initial sliding window and an initial sliding step length according to the analysis result, and determining the conversion relation between the image to be processed and the high-frequency image; and adjusting the initial sliding window and the initial sliding step length based on the conversion relation to obtain the target sliding window and the target sliding step length.
Specifically, the initial sliding window is the sliding window preset for the image to be processed, and the initial sliding step is the sliding step preset for the image to be processed; correspondingly, the conversion relation specifically refers to the proportional relation between the image to be processed and the high-frequency image, which is determined by the corresponding parameters of the wavelet transform.
Based on this, when the image information density needs to be calculated, the image information density calculation strategy can be analyzed to obtain the initial sliding window and initial sliding step applied to the image to be processed, and the conversion relation between the image to be processed and the high-frequency image determined at the same time. The initial sliding window and initial sliding step are then adjusted according to the conversion relation, giving the target sliding window and target sliding step applied to the high-frequency image, from which the image information density of each area can subsequently be calculated.
It should be noted that the sizes of the sliding window and the sliding step may be set according to the actual application scenario and are not limited herein; the unit of both is the pixel.
In summary, by adjusting the initial sliding step length and the initial sliding window according to the conversion relationship, the target sliding window and the target sliding step length applied to the high-frequency image can be obtained, so that the subsequent calculation of the image information density on the high-frequency image can be facilitated, and the calculation is more comprehensive and accurate.
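Since the high-frequency image is one half the size of the image to be processed (for one level of wavelet decomposition), the conversion of the initial window and step can be sketched as below. The function name and the fixed integer scale factor are illustrative assumptions; the specification only states that the conversion relation is determined by the wavelet transform parameters:

```python
def to_target_window(initial_window, initial_step, scale=2):
    """Convert the sliding window and step defined on the image to be
    processed into the target window and step on the high-frequency image.
    scale is the size ratio between the two images (2 for one level of
    wavelet decomposition, where the high-frequency image is half-sized)."""
    target_window = (initial_window[0] // scale, initial_window[1] // scale)
    target_step = initial_step // scale
    return target_window, target_step
```

With the values used later in the example, a 256 × 256 initial window and step of 32 convert to a 128 × 128 target window and step of 16.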
Step S1084, moving the target sliding window in the high-frequency image according to the target sliding step length, and calculating image information density corresponding to each high-frequency image area in the high-frequency image according to the moving result;
specifically, after the target sliding window and the target sliding step applied to the high-frequency image are determined, the target sliding window may be moved in the high-frequency image by the target sliding step, the image information density of each high-frequency image area in the high-frequency image calculated from each movement result, and the image information density of each image area in the image to be processed subsequently obtained by mapping back to the image to be processed.
Based on this, each high-frequency image area in the high-frequency image specifically refers to the area framed by the target sliding window after each movement, and the image information density of each high-frequency image area specifically refers to the proportion of the framed area covered by information: the larger the proportion, the higher the image information density, and the smaller the proportion, the lower the image information density. The target sliding window corresponds to one high-frequency image area after each movement, and each high-frequency image area corresponds to one image information density.
Further, when calculating the image information density of each high-frequency image region, in order to ensure that the calculation is completed in the high-frequency dimension, the pixel information density of each pixel point in the image to be processed needs to be mapped to the high-frequency dimension, so that the calculation is carried out in the grayscale image; this not only improves calculation accuracy but also ensures calculation efficiency. In this embodiment, the calculation process for any one high-frequency image region is as follows:
reading the pixel information density corresponding to each pixel point in a target image area of the image to be processed; calculating the high-frequency pixel information density corresponding to each pixel point in the target high-frequency image area of the high-frequency image according to the pixel information density corresponding to each pixel point in the target image area; and calculating the average value of the high-frequency pixel information density corresponding to each pixel point in the target high-frequency image area, and determining the image information density corresponding to the target high-frequency image area according to the calculation result.
Specifically, the target image area specifically refers to any image area in the image to be processed, where image information density calculation is required; the pixel information density specifically refers to the information density corresponding to each pixel point in the target image area; correspondingly, the target high-frequency image area specifically refers to a high-frequency image area corresponding to the target image area in the high-frequency image, and the high-frequency pixel information density specifically refers to information density corresponding to each pixel point in the target high-frequency image area. It should be noted that the pixel information density corresponding to each pixel point is determined based on the ratio of the information occupied area in the pixel point to the total area of the pixel point.
Based on this, when the target sliding window moves in the high-frequency image by the target sliding step, the image information density of any one high-frequency image area is calculated as follows. First, the pixel information density of each pixel point in the target image area of the image to be processed is read; the area framed by the target sliding window in the high-frequency image is the target high-frequency image area, and it corresponds to the target image area after mapping to the image to be processed. After the pixel information density corresponding to each pixel point in the target image area is determined, the high-frequency pixel information density corresponding to each pixel point in the target high-frequency image area can be calculated based on this mapping relation, that is, the information density corresponding to each pixel point after it is mapped to the high-frequency dimension. Furthermore, since the target high-frequency image region contains a plurality of pixel points whose high-frequency pixel information densities differ, in order to ensure calculation accuracy, averaging can be used: the high-frequency pixel information densities of all pixel points in the target high-frequency image region are averaged to obtain the image information density of that region, which is then mapped to the image to be processed to obtain the image information density of the target image area. Repeating this for every window position yields the image information density of each image area in the image to be processed.
In a specific implementation, the high-frequency pixel information density corresponding to each pixel point in the target high-frequency image region can be calculated as follows: first, the pixel information density corresponding to any pixel point in the image to be processed is determined; this density is then divided by the maximum pixel information density in the image to be processed, and the ratio multiplied by 255 (because the value of each pixel point in the high-frequency image lies in the range of 0-255). The result is the high-frequency pixel information density of the corresponding pixel point in the target high-frequency image region. Calculating each pixel point in this way yields the high-frequency pixel information density corresponding to every pixel point in the target high-frequency image region.
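The per-region computation just described (scale each pixel information density by its ratio to the maximum density times 255, then average over the region) can be sketched as follows. The function name is illustrative; pixel_densities is assumed to hold the densities read from one target image area:

```python
def region_information_density(pixel_densities, max_density):
    """Image information density of one target high-frequency image area:
    map each pixel information density into the 0-255 high-frequency
    dimension (ratio to the maximum pixel information density, times 255),
    then take the average over the region."""
    if max_density == 0:
        return 0.0  # a region with no information has zero density
    high_freq = [d / max_density * 255.0 for d in pixel_densities]
    return sum(high_freq) / len(high_freq)
```

For example, a region containing two pixels with densities 0.5 and 1.0 against a maximum of 1.0 averages the mapped values 127.5 and 255.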
It should be noted that, because the high-frequency image is the grayscale image corresponding to the image to be processed in the high-frequency dimension, the closer the value of a pixel point in the high-frequency image is to 255, the closer the pixel point is to white and the greater its image information density; conversely, the closer the value is to 0, the closer the pixel point is to black and the smaller its image information density.
In summary, by calculating the image information density of the target high-frequency image region by taking the pixel point as a unit, not only is the calculation accuracy of the image information density of each high-frequency image region improved, but also the calculation efficiency can be ensured, so that the image information density of each image region in the image to be processed can be conveniently and quickly mapped out subsequently.
Step S1086, determining the image information density corresponding to each image area in the image to be processed based on the image information density corresponding to each high-frequency image area in the high-frequency image.
Specifically, after the calculation of the image information density of each high-frequency image region in the high-frequency image is completed, and because the image to be processed and the high-frequency image have a conversion relation, the image information density corresponding to each image area in the image to be processed may be determined from the image information density corresponding to each high-frequency image area, so as to complete the mapping from the high-frequency dimension to the image to be processed and realize the calculation of the image information density of each image area.
Further, when determining the image information density of each image area in the image to be processed, the image information density may be determined by combining the mapping relationship between the image to be processed and the high-frequency image, and in this embodiment, the specific implementation manner is as follows:
establishing a mapping relation between each high-frequency image area in the high-frequency image and each image area in the image to be processed; and determining the image information density corresponding to each image area in the image to be processed based on the mapping relation and the image information density corresponding to each high-frequency image area.
Specifically, the mapping relation between each pixel point in the high-frequency image and each pixel point in the image to be processed can be determined first; from this pixel-level mapping, the mapping relation between each image area and each high-frequency image area is established, and the image information density of each image area in the image to be processed is then determined based on it. That is, the image information density of a high-frequency image area is taken as the image information density of the image area having the mapping relation with it.
Based on this, after the image information density of each high-frequency image area in the high-frequency image is obtained through calculation, a mapping relation between each high-frequency image area in the high-frequency image and each image area in the image to be processed can be established, and then the image information density of each high-frequency image area is given to the corresponding image area based on the mapping relation, so that the image information density of each image area in the image to be processed is obtained.
In summary, after the calculation of the image information density of each high-frequency image region is completed in the high-frequency dimension, the image information density of each image region in the image to be processed can be determined based on the mapping relationship between the image to be processed and the high-frequency image, so that the accuracy of the image information density can be effectively ensured.
Following the above example, after obtaining the high-frequency image corresponding to the image to be processed, the preset image information density calculation strategy can be analyzed to obtain an initial sliding window of 256 × 256 pixels and an initial sliding step of 32 pixels, both corresponding to the image to be processed; then, based on the conversion relation between the image to be processed and the high-frequency image, the initial sliding window and the initial sliding step are adjusted to obtain a target sliding window of 128 × 128 pixels and a target sliding step of 16 pixels.
Further, the target sliding window of 128 × 128 pixels is moved in the high-frequency image by the target sliding step of 16 pixels; after each slide, the image information density of the high-frequency image area currently framed by the target sliding window is obtained, and once the window has finished moving, the image information density corresponding to each high-frequency image area in the high-frequency image is obtained. Then, based on the mapping relation between the image to be processed and the high-frequency image, the image information density of each image area in the image to be processed can be obtained. It is thus determined that the image information density of the S1-th image area in the image to be processed is the minimum, 0.0166, and that of the S2-th image area is the maximum, 0.1088. After the image information density is calculated, the schematic diagram shown in fig. 7 can be generated, in which the two areas with the maximum and minimum density are marked by rectangular boxes carrying their image information densities.
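The sliding-window traversal can be sketched as below, assuming the high-frequency image is a NumPy array of 0-255 values. Averaging the normalized grayscale values of the framed area is one way to realize the average-based density computation described above; the function name and the returned layout are illustrative:

```python
import numpy as np

def scan_density(hf_img, window=128, step=16):
    """Slide a window over the high-frequency image and return, for each
    window position (top-left corner), the mean of the framed area's
    0-255 values normalized to 0-1, used as the image information density
    of that high-frequency image area. window and step are the target
    values already converted for the high-frequency image."""
    h, w = hf_img.shape
    densities = {}
    for y in range(0, h - window + 1, step):
        for x in range(0, w - window + 1, step):
            patch = hf_img[y:y + window, x:x + window]
            densities[(y, x)] = patch.mean() / 255.0
    return densities
```

The positions with the minimum and maximum returned values correspond to the S1-th and S2-th areas of the example once mapped back to the image to be processed.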
In addition, in a business scene, the image information density needs to be recalculated each time the display content changes, and since the image information density changes with the display content, the area corresponding to the maximum/minimum image information density also changes. If the placement position of the visual resource were adjusted on every change, its position would vary at random and greatly affect the viewing experience of the user. Therefore, to avoid random changes in the position of the visual resource, the image area with the minimum image information density can be selected as the added image area from among designated image areas, and the embedding of the visual resource then performed. In this embodiment, the specific implementation manner is as follows:
selecting an image area corresponding to the minimum image information density as an added image area; and reading the visual resources matched with the image to be processed, and adding the visual resources to the image adding area.
Specifically, the added image area specifically refers to an image area with the minimum image information density in one or more designated image areas in the image to be processed, and correspondingly, the visualization resource specifically refers to resources such as an image, a video, and an animation that need to be added to the added image area, which is not limited in this embodiment.
In a specific implementation, since the image to be processed may change at any time, the image information density of each image area in the current image to be processed needs to be recalculated on every change. If the image area with the smallest density were always selected directly as the added image area, the position of the visualization resource would change along with the image to be processed, seriously affecting the viewing experience of the user. Therefore, one or more image areas can be designated as candidate image areas; after the image information density of each image region is calculated, the image information densities of the candidate image regions are determined directly, and the candidate area with the minimum image information density in the current image to be processed is selected as the added image area, so as to realize the embedding operation of the visualization resource.
Following the above example, the lower left corner and the lower right corner of the PPT display page are designated as candidate image regions. After image information density calculation is performed on each image region of the image to be processed, as shown in fig. 8, the image information density of the candidate image region in the lower left corner is determined to be 0.0319, and that of the candidate image region in the lower right corner to be 0.0225. The lower-right candidate image region therefore has the minimum image information density and is selected as the added image area; the avatar of the teaching user who explains the PPT is added at this position as the visual resource embedded into the current PPT page, and the schematic diagram shown in (a) of fig. 9 is generated from the embedding result, so that the PPT content is explained while occluding the least content.
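The candidate selection is a simple minimum over the designated regions' densities. A sketch, with an assumed mapping from region name to density:

```python
def choose_added_area(candidate_densities):
    """Select the added image area: among the designated candidate image
    areas, pick the one with the minimum image information density.
    candidate_densities maps a region identifier to its density."""
    return min(candidate_densities, key=candidate_densities.get)
```

With the example's values, {"lower_left": 0.0319, "lower_right": 0.0225} selects the lower-right region.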
In summary, by determining the added image area through comparison of image information densities, the position where the visual resource covers the least information content can be selected for embedding, which avoids occluding information that would affect the viewing experience of the user and further improves the user's participation experience.
Furthermore, when the image to be processed contains a large amount of information, a certain amount of information will be covered even if the image area with the minimum image information density is selected for embedding the visualization resource. To allow the user to view all the information, the visualization resource may be made transparent, which is specifically implemented in this embodiment as follows:
determining a transparentization level based on the image information density corresponding to the added image area; and performing transparentization processing on the visual resources according to the transparentization level, and adding the visual resources subjected to transparentization processing in the image adding area.
Specifically, the transparentization level is a level representing the transparency of the visual resource: the higher the level, the higher the transparency of the visual resource, and the lower the level, the lower the transparency. The transparentization level corresponds to the image information density: the higher the image information density of the image area, the more information it contains and the higher the corresponding transparentization level; the lower the density, the less information it contains and the lower the level.
Based on this, after the image adding region is determined, in order to avoid the visualization resource from shielding the information in the image to be processed, the transparentization level corresponding to the image information density of the image adding region may be determined according to the corresponding relationship between the image information density and the transparentization level, then the visualization resource is subjected to transparentization processing according to the level, and finally the visualization resource after the transparentization processing is embedded into the image adding region.
Following the above example, in order to avoid occluding the PPT content with the avatar of the teaching user explaining the PPT, the transparentization level may be determined based on the image information density of the lower-right candidate image region; the teaching user's avatar is then made transparent according to this level and added to the added image area, yielding the schematic diagram shown in (b) of fig. 9, which avoids occluding the content in the PPT.
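One way to realize the density-to-level correspondence is a monotone step function, sketched below. The threshold values are assumptions for illustration only; the specification states only that a higher image information density maps to a higher transparentization level:

```python
def transparency_level(density, thresholds=(0.02, 0.05, 0.10)):
    """Map the image information density of the added image area to a
    transparentization level: the higher the density, the higher the
    level (more transparent). The threshold values are illustrative
    assumptions, not values given in the specification."""
    level = 0
    for t in thresholds:
        if density >= t:
            level += 1
    return level
```

Under these assumed thresholds, the example's minimum-density area (0.0166) would keep the avatar fully opaque, while the lower-right candidate area (0.0225) would use level 1.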
In conclusion, the visualized resources are processed in a transparent processing mode, so that the information in the image to be processed can be prevented from being shielded, the visualized resources can be embedded, and the viewing experience of a user is further improved.
In the image processing method provided in this specification, after the image to be processed is obtained, in order to accurately calculate the image information density of each region, the image to be processed may first be converted into a grayscale image, and wavelet transformation then performed on the grayscale image to obtain the high-frequency information of the image to be processed. A high-frequency image corresponding to the image to be processed is then created based on the high-frequency information, so that the image information density corresponding to each image area is obtained in the high-frequency dimension. This effectively guarantees the accuracy of the image information density calculated for each image area, facilitates the subsequent embedding of visual resources based on the image information density, and prevents the information in the image to be processed from being occluded.
Corresponding to the above method embodiment, the present specification further provides an image processing apparatus embodiment, and fig. 10 shows a schematic structural diagram of an image processing apparatus provided in an embodiment of the present specification. As shown in fig. 10, the apparatus includes:
an acquisition module 1002 configured to acquire an image to be processed;
a conversion module 1004 configured to convert the image to be processed into a grayscale image, and perform wavelet transformation on the grayscale image to obtain high-frequency information of the image to be processed;
a creating module 1006 configured to create a high-frequency image corresponding to the image to be processed based on the high-frequency information;
and the processing module 1008 is configured to process the high-frequency image according to a preset image information density calculation strategy, and determine the image information density corresponding to each image area in the image to be processed according to the processing result.
In an optional embodiment, the high frequency information comprises at least one of:
horizontal high frequency information, vertical high frequency information, diagonal high frequency information.
In an optional embodiment, the creating module 1006 is further configured to:
calculating pixel information corresponding to each pixel point in the image to be processed based on the horizontal high-frequency information, the vertical high-frequency information and the diagonal high-frequency information; and mapping the value of each pixel point in the image to be processed to a gray scale interval according to the pixel information, and creating the high-frequency image according to a mapping result.
In an optional embodiment, the processing module 1008 is further configured to:
analyzing the image information density calculation strategy, and determining a target sliding window and a target sliding step length according to an analysis result; moving the target sliding window in the high-frequency image according to the target sliding step length, and calculating the image information density corresponding to each high-frequency image area in the high-frequency image according to the moving result; and determining the image information density corresponding to each image area in the image to be processed based on the image information density corresponding to each high-frequency image area in the high-frequency image.
In an optional embodiment, the processing module 1008 is further configured to:
determining an initial sliding window and an initial sliding step length according to the analysis result, and determining the conversion relation between the image to be processed and the high-frequency image; and adjusting the initial sliding window and the initial sliding step length based on the conversion relation to obtain the target sliding window and the target sliding step length.
In an optional embodiment, the processing module 1008 is further configured to:
reading the pixel information density corresponding to each pixel point in a target image area of the image to be processed; calculating the high-frequency pixel information density corresponding to each pixel point in the target high-frequency image area of the high-frequency image according to the pixel information density corresponding to each pixel point in the target image area; and calculating the average value of the high-frequency pixel information density corresponding to each pixel point in the target high-frequency image area, and determining the image information density corresponding to the target high-frequency image area according to the calculation result.
In an optional embodiment, the processing module 1008 is further configured to:
establishing a mapping relation between each high-frequency image area in the high-frequency image and each image area in the image to be processed; and determining the image information density corresponding to each image area in the image to be processed based on the mapping relation and the image information density corresponding to each high-frequency image area.
In an optional embodiment, the image processing apparatus further includes:
an adding module configured to select the image area corresponding to the minimum image information density as an added image area, to read the visualization resource matched with the image to be processed, and to add the visualization resource to the added image area.
In an optional embodiment, the adding module is further configured to:
determining a transparentization level based on the image information density corresponding to the added image area; and performing transparentization processing on the visualization resource according to the transparentization level, and adding the transparentized visualization resource to the added image area.
After the image to be processed is obtained, in order to subsequently calculate the image information density of each region in the image to be processed accurately, the image to be processed may first be converted into a grayscale image, and wavelet transformation may then be performed on the grayscale image to obtain the high-frequency information of the image to be processed. A high-frequency image corresponding to the image to be processed is then created based on the high-frequency information, so that the image information density corresponding to each image area in the image to be processed is obtained in the high-frequency dimension. This effectively guarantees the accuracy of the image information density calculation for each image area, facilitates the subsequent embedding of visualization resources based on the image information density, and prevents the information in the image to be processed from being blocked.
The above is a schematic configuration of an image processing apparatus of the present embodiment. It should be noted that the technical solution of the image processing apparatus belongs to the same concept as the technical solution of the image processing method, and details that are not described in detail in the technical solution of the image processing apparatus can be referred to the description of the technical solution of the image processing method.
Fig. 11 is a flowchart illustrating a presentation method according to an embodiment of the present disclosure, which specifically includes the following steps:
step S1102, acquiring an image to be processed, and converting the image to be processed into a grayscale image.
Step S1104, performing wavelet transformation on the grayscale image to obtain high-frequency information of the image to be processed, and creating a high-frequency image corresponding to the image to be processed based on the high-frequency information.
Step S1106, processing the high-frequency image according to a preset image information density calculation strategy, and determining an image information density corresponding to an initial image area in the image to be processed according to a processing result.
The related description in the presentation method provided by this embodiment is similar to that in the image processing method above; for the same or similar content, reference may be made to the corresponding description of the image processing method.
Specifically, the initial image area refers to one or more image areas specified in the image to be processed. By calculating the image information density of the initial image area, the area where the visualization resource needs to be added can be determined quickly, so that the embedding of the visualization resource can be completed without blocking the information in the image to be processed.
Further, since visualization resources are added at different positions in different service scenes, and in order to avoid the position of the visualization resource changing arbitrarily, the initial image region may be located directly in the image to be processed based on the service scene. In this embodiment, a specific implementation is as follows:
determining a target service corresponding to the image to be processed; loading an image area positioning strategy corresponding to the target service; and determining the initial image area in the image to be processed according to the image area positioning strategy.
For example, when a PPT is being explained, in order to prevent the head of the teaching user who explains the PPT from occluding the contents of the PPT, the lower left corner and the lower right corner of the PPT may be selected as the initial image regions of the current PPT page, so that visualization resources can subsequently be added at these positions for display.
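The corner positioning strategy in the PPT example above can be sketched as follows; the 25% height/width share of each candidate region is an illustrative assumption, not a value from the patent.

```python
def lower_corner_regions(img_h, img_w, frac=0.25):
    """Return the lower-left and lower-right candidate initial image
    regions of a slide as (top, left, height, width) tuples. Each region
    occupies a frac share of the slide's height and width (frac=0.25 is
    an assumption)."""
    rh, rw = int(img_h * frac), int(img_w * frac)
    return {
        "lower_left": (img_h - rh, 0, rh, rw),
        "lower_right": (img_h - rh, img_w - rw, rh, rw),
    }
```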
Step S1108, a target image area is screened from the initial image area based on the image information density, and a visualization resource is added to the target image area in the image to be processed for displaying.
Specifically, after the image information density of the initial image region specified in the image to be processed is determined, in order to reduce the amount of information covered by the visualization resource, a target image region may be screened from the initial image region according to the image information density, and the visualization resource may then be added to the target image region in the image to be processed and displayed to the user.
It should be noted that the image information density of each image region is calculated from pixel points, so when the visualization resource has an irregular outline, it can still be added to the target image region and embedded into the image to be processed, displaying the content to the user with a better effect. In addition, the image information density can also be calculated for irregular image regions; that is, when the initial image region in the image to be processed is irregular, its image information density can still be calculated by the method described above, which facilitates the addition of irregular visualization resources.
Further, in order to complete the addition of the visualization resource and avoid the occlusion of the information in the image to be processed, the target image area may be determined as follows:
comparing the image information density corresponding to each initial image area; and selecting the initial image area corresponding to the minimum image information density as the target image area according to the comparison result.
Furthermore, when the image to be processed contains a large amount of information, even the image area with the minimum image information density will cover some information once the visualization resource is embedded. To make it convenient for the user to view all the information, the visualization resource may be transparentized. A specific implementation in this embodiment is as follows:
determining the target image information density corresponding to the target image area; determining a target transparentization level corresponding to the target image information density based on the correspondence between image information density and transparentization level; and performing transparentization processing on the visualization resource according to the target transparentization level, and adding the transparentized visualization resource to the target image area in the image to be processed for display.
Following the above example and referring to the schematic diagram shown in fig. 12, the image information density of the initial image area in the lower left corner is 0.0213, and that of the initial image area in the lower right corner is 0.0136. The initial image area in the lower right corner is therefore selected as the target image area. Based on the correspondence between image information density and transparentization level, the transparentization level corresponding to the image information density 0.0136 of the target image area is determined to be X; the visualization resource is transparentized according to the transparentization level X and then added to the target image area, generating the schematic diagram shown in fig. 13.
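For illustration only, the selection and transparentization steps in this example can be sketched as follows. The linear density-to-opacity mapping and the 0.05 density cap are assumptions — the patent leaves the correspondence between image information density and transparentization level unspecified (the level X above is not given a concrete value).

```python
def select_target_region(region_densities):
    """Pick the initial image region with the minimum image information
    density as the target image region."""
    return min(region_densities, key=region_densities.get)

def overlay_opacity(density, density_cap=0.05, max_opacity=1.0, min_opacity=0.3):
    """Map an image information density to an overlay opacity: the denser
    the region, the more transparent the visualization resource. The
    linear mapping and the density_cap value are assumptions."""
    t = min(density, density_cap) / density_cap
    return max_opacity - (max_opacity - min_opacity) * t

# Densities from the example in the text
densities = {"lower_left": 0.0213, "lower_right": 0.0136}
target = select_target_region(densities)
alpha = overlay_opacity(densities[target])
```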
In summary, after the image to be processed is obtained, in order to subsequently calculate the image information density of each region in the image to be processed accurately, the image to be processed may be converted into a grayscale image, and wavelet transformation may then be performed on the grayscale image to obtain the high-frequency information of the image to be processed. A high-frequency image corresponding to the image to be processed is then created based on the high-frequency information, so that the image information density corresponding to each initial image area in the image to be processed is obtained in the high-frequency dimension, effectively guaranteeing the accuracy of the image information density calculation for each initial image area. Finally, a target image area is screened out according to the image information density for adding and displaying visualization resources, so that the information in the image to be processed is prevented from being blocked and the viewing experience of the viewing user is improved.
Corresponding to the above method embodiment, the present specification further provides an embodiment of a display apparatus, and fig. 14 shows a schematic structural diagram of the display apparatus provided in the embodiment of the present specification. As shown in fig. 14, the apparatus includes:
an image acquiring module 1402 configured to acquire an image to be processed, and convert the image to be processed into a grayscale image;
an image creating module 1404 configured to perform wavelet transformation on the grayscale image to obtain high-frequency information of the image to be processed, and create a high-frequency image corresponding to the image to be processed based on the high-frequency information;
the image processing module 1406 is configured to process the high-frequency image according to a preset image information density calculation strategy, and determine the image information density corresponding to the initial image area in the image to be processed according to the processing result;
a screening area module 1408 configured to screen a target image area in the initial image area based on the image information density, and add a visualization resource to the target image area in the image to be processed for presentation.
In an alternative embodiment, the initial image area is determined by:
determining a target service corresponding to the image to be processed; loading an image area positioning strategy corresponding to the target service; and determining the initial image area in the image to be processed according to the image area positioning strategy.
In an optional embodiment, the screening area module 1408 is further configured to:
comparing the image information density corresponding to each initial image area; and selecting the initial image area corresponding to the minimum image information density as the target image area according to the comparison result.
In an optional embodiment, the screening area module 1408 is further configured to:
determining the target image information density corresponding to the target image area; determining a target transparentization level corresponding to the target image information density based on the correspondence between image information density and transparentization level; and performing transparentization processing on the visualization resource according to the target transparentization level, and adding the transparentized visualization resource to the target image area in the image to be processed for display.
In summary, after the image to be processed is obtained, in order to subsequently calculate the image information density of each region in the image to be processed accurately, the image to be processed may be converted into a grayscale image, and wavelet transformation may then be performed on the grayscale image to obtain the high-frequency information of the image to be processed. A high-frequency image corresponding to the image to be processed is then created based on the high-frequency information, so that the image information density corresponding to each initial image area in the image to be processed is obtained in the high-frequency dimension, effectively guaranteeing the accuracy of the image information density calculation for each initial image area. Finally, a target image area is screened out according to the image information density for adding and displaying visualization resources, so that the information in the image to be processed is prevented from being blocked and the viewing experience of the viewing user is improved.
The above is a schematic scheme of a display device of the present embodiment. It should be noted that the technical solution of the display apparatus and the technical solution of the display method belong to the same concept, and details that are not described in detail in the technical solution of the display apparatus can be referred to the description of the technical solution of the display method.
FIG. 15 illustrates a block diagram of a computing device 1500 provided in accordance with an embodiment of the present description. The components of the computing device 1500 include, but are not limited to, a memory 1510 and a processor 1520. The processor 1520 is coupled to the memory 1510 via a bus 1530 and a database 1550 is used to store data.
The computing device 1500 also includes an access device 1540 that enables the computing device 1500 to communicate via one or more networks 1560. Examples of such networks include the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the Internet. The access device 1540 may include one or more of any type of network interface, wired or wireless (e.g., a Network Interface Card (NIC)), such as an IEEE 802.11 Wireless Local Area Network (WLAN) interface, a Worldwide Interoperability for Microwave Access (WiMAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth interface, a Near Field Communication (NFC) interface, and so forth.
In one embodiment of the present description, the above-described components of computing device 1500, as well as other components not shown in FIG. 15, may also be connected to each other, such as by a bus. It should be understood that the block diagram of the computing device structure shown in FIG. 15 is for purposes of example only and is not limiting as to the scope of the description. Those skilled in the art may add or replace other components as desired.
Computing device 1500 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), mobile phone (e.g., smartphone), wearable computing device (e.g., smartwatch, smartglasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 1500 may also be a mobile or stationary server.
The processor 1520 is configured, among other things, to execute computer-executable instructions that implement the image processing method or the presentation method described above.
The above is an illustrative scheme of a computing device of the present embodiment. It should be noted that the technical solution of the computing device belongs to the same concept as the technical solution of the image processing method or the presentation method, and details that are not described in detail in the technical solution of the computing device can be referred to the description of the technical solution of the image processing method or the presentation method.
An embodiment of the present specification also provides a computer-readable storage medium storing computer instructions which, when executed by a processor, implement the image processing method or the presentation method described above.
The above is an illustrative scheme of a computer-readable storage medium of the present embodiment. It should be noted that the technical solution of the storage medium belongs to the same concept as the technical solution of the image processing method or the presentation method, and for details that are not described in detail in the technical solution of the storage medium, reference may be made to the description of the technical solution of the image processing method or the presentation method.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The computer instructions comprise computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals, in accordance with legislation and patent practice.
It should be noted that, for the sake of simplicity, the foregoing method embodiments are described as a series of acts or combinations, but those skilled in the art should understand that the present disclosure is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present disclosure. Further, those skilled in the art should also appreciate that the embodiments described in this specification are preferred embodiments and that acts and modules referred to are not necessarily required for this description.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The preferred embodiments of the present specification disclosed above are intended only to aid in describing the specification. The alternative embodiments are not exhaustive and do not limit the invention to the precise forms described; obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the specification and its practical application, thereby enabling others skilled in the art to understand and make use of it. The specification is limited only by the claims and their full scope and equivalents.

Claims (17)

1. An image processing method, comprising:
acquiring an image to be processed;
converting the image to be processed into a gray image, and performing wavelet transformation on the gray image to obtain high-frequency information of the image to be processed;
creating a high-frequency image corresponding to the image to be processed based on the high-frequency information;
and processing the high-frequency image according to a preset image information density calculation strategy, and determining the image information density corresponding to each image area in the image to be processed according to the processing result, wherein the image information density calculation strategy is a strategy of moving in the high-frequency image according to a preset sliding window and a preset step length and calculating the image information density of the corresponding area of the sliding window in the high-frequency image after each movement.
2. The image processing method according to claim 1, wherein the high frequency information includes at least one of:
horizontal high frequency information, vertical high frequency information, diagonal high frequency information.
3. The image processing method according to claim 2, wherein the creating a high-frequency image corresponding to the image to be processed based on the high-frequency information comprises:
calculating pixel information corresponding to each pixel point in the image to be processed based on the horizontal high-frequency information, the vertical high-frequency information and the diagonal high-frequency information;
and mapping the value of each pixel point in the image to be processed to a gray scale interval according to the pixel information, and creating the high-frequency image according to a mapping result.
4. The image processing method according to claim 1, wherein the processing the high-frequency image according to a preset image information density calculation strategy, and determining the image information density corresponding to each image area in the image to be processed according to the processing result comprises:
analyzing the image information density calculation strategy, and determining a target sliding window and a target sliding step length according to an analysis result;
moving the target sliding window in the high-frequency image according to the target sliding step length, and calculating the image information density corresponding to each high-frequency image area in the high-frequency image according to the moving result;
and determining the image information density corresponding to each image area in the image to be processed based on the image information density corresponding to each high-frequency image area in the high-frequency image.
5. The image processing method according to claim 4, wherein the determining a target sliding window and a target sliding step size according to the analysis result comprises:
determining an initial sliding window and an initial sliding step length according to the analysis result, and determining the conversion relation between the image to be processed and the high-frequency image;
and adjusting the initial sliding window and the initial sliding step length based on the conversion relation to obtain the target sliding window and the target sliding step length.
6. The image processing method according to claim 4, wherein the image information density corresponding to any one of the high-frequency image regions is determined as follows:
reading the pixel information density corresponding to each pixel point in a target image area of the image to be processed;
calculating the high-frequency pixel information density corresponding to each pixel point in the target high-frequency image area of the high-frequency image according to the pixel information density corresponding to each pixel point in the target image area;
and calculating the average value of the high-frequency pixel information density corresponding to each pixel point in the target high-frequency image area, and determining the image information density corresponding to the target high-frequency image area according to the calculation result.
7. The image processing method according to claim 4, wherein the determining the image information density corresponding to each image area in the image to be processed based on the image information density corresponding to each high-frequency image area in the high-frequency image comprises:
establishing a mapping relation between each high-frequency image area in the high-frequency image and each image area in the image to be processed;
and determining the image information density corresponding to each image area in the image to be processed based on the mapping relation and the image information density corresponding to each high-frequency image area.
8. The image processing method according to any one of claims 1 to 7, wherein after the step of determining the image information density corresponding to each image area in the image to be processed according to the processing result is executed, the method further comprises:
selecting an image area corresponding to the minimum image information density as an added image area;
and reading the visualization resource matched with the image to be processed, and adding the visualization resource to the added image area.
9. The image processing method of claim 8, wherein the adding the visualization resource to the added image area comprises:
determining a transparentization level based on the image information density corresponding to the added image area;
and performing transparentization processing on the visualization resource according to the transparentization level, and adding the transparentized visualization resource to the added image area.
10. An image processing apparatus characterized by comprising:
an acquisition module configured to acquire an image to be processed;
the conversion module is configured to convert the image to be processed into a gray image and perform wavelet transformation on the gray image to obtain high-frequency information of the image to be processed;
the creating module is configured to create a high-frequency image corresponding to the image to be processed based on the high-frequency information;
and the processing module is configured to process the high-frequency image according to a preset image information density calculation strategy, and determine the image information density corresponding to each image area in the image to be processed according to the processing result, wherein the image information density calculation strategy is a strategy of moving in the high-frequency image according to a preset sliding window and a preset step length and calculating the image information density of the corresponding area of the sliding window in the high-frequency image after each movement.
11. A method of displaying, comprising:
acquiring an image to be processed, and converting the image to be processed into a gray image;
performing wavelet transformation on the gray level image to obtain high-frequency information of the image to be processed, and creating a high-frequency image corresponding to the image to be processed based on the high-frequency information;
processing the high-frequency image according to a preset image information density calculation strategy, and determining the image information density corresponding to an initial image area in the image to be processed according to a processing result, wherein the image information density calculation strategy is a strategy of moving in the high-frequency image according to a preset sliding window and a preset step length and calculating the image information density of the corresponding area of the sliding window in the high-frequency image after each movement;
and screening a target image area in the initial image area based on the image information density, and adding visual resources to the target image area in the image to be processed for displaying.
12. A presentation method as claimed in claim 11, wherein said initial image area is determined by:
determining a target service corresponding to the image to be processed;
loading an image area positioning strategy corresponding to the target service;
and determining the initial image area in the image to be processed according to the image area positioning strategy.
13. The presentation method of claim 11, wherein the screening of the initial image region for a target image region based on the image information density comprises:
comparing the image information density corresponding to each initial image area;
and selecting an initial image area corresponding to the minimum image information density as the target image area according to the comparison result.
14. The method according to any one of claims 11 to 13, wherein adding a visualization resource to the target image area in the image to be processed for displaying comprises:
determining the target image information density corresponding to the target image area;
determining a target transparentization level corresponding to the target image information density based on the corresponding relation between the image information density and the transparentization level;
and performing transparentization processing on the visualization resource according to the target transparentization level, and adding the transparentized visualization resource to the target image area in the image to be processed for display.
15. A display device, comprising:
the image acquisition module is configured to acquire an image to be processed and convert the image to be processed into a gray image;
the image creating module is configured to perform wavelet transformation on the grayscale image to obtain high-frequency information of the image to be processed, and create a high-frequency image corresponding to the image to be processed based on the high-frequency information;
the image processing module is configured to process the high-frequency image according to a preset image information density calculation strategy, and determine the image information density corresponding to an initial image area in the image to be processed according to a processing result, wherein the image information density calculation strategy is a strategy of moving in the high-frequency image according to a preset sliding window and a preset step length and calculating the image information density of the corresponding area of the sliding window in the high-frequency image after each movement;
and the screening area module is configured to screen a target image area in the initial image area based on the image information density, and add visual resources to the target image area in the image to be processed for displaying.
16. A computing device, comprising:
a memory and a processor;
the memory is configured to store computer-executable instructions, and the processor is configured to execute the computer-executable instructions to implement the steps of the method of any one of claims 1 to 9 or 11 to 14.
17. A computer-readable storage medium storing computer instructions, which when executed by a processor, perform the steps of the method of any one of claims 1 to 9 or 11 to 14.
CN202111218339.5A 2021-10-20 2021-10-20 Image processing method and device Active CN113658085B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111218339.5A CN113658085B (en) 2021-10-20 2021-10-20 Image processing method and device
PCT/CN2022/083953 WO2023065604A1 (en) 2021-10-20 2022-03-30 Image processing method and apparatus

Publications (2)

Publication Number Publication Date
CN113658085A CN113658085A (en) 2021-11-16
CN113658085B true CN113658085B (en) 2022-02-01

Family

ID=78484255

Country Status (2)

Country Link
CN (1) CN113658085B (en)
WO (1) WO2023065604A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113658085B (en) * 2021-10-20 2022-02-01 Beijing Youmu Technology Co ltd Image processing method and device
CN113938752A (en) * 2021-11-30 2022-01-14 Lenovo (Beijing) Co Ltd Processing method and device
CN115576358B (en) * 2022-12-07 2023-03-10 Northwestern Polytechnical University Unmanned aerial vehicle distributed control method based on machine vision

Citations (5)

Publication number Priority date Publication date Assignee Title
US20140307972A1 (en) * 2013-04-12 2014-10-16 Megachips Corporation Image processing apparatus and image processing method
CN110348459A (en) * 2019-06-28 2019-10-18 Xi'an University of Technology Sonar image fractal feature extraction method based on a multi-scale fast covering blanket method
CN111652854A (en) * 2020-05-13 2020-09-11 Sun Yat-sen University No-reference image quality evaluation method based on image high-frequency information
CN112270271A (en) * 2020-10-31 2021-01-26 Chongqing Business Vocational College Iris identification method based on wavelet packet decomposition
CN113515978A (en) * 2020-04-16 2021-10-19 Alibaba Group Holding Ltd Data processing method, device and storage medium

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
WO2009047833A1 (en) * 2007-10-09 2009-04-16 C4 Technology, Inc. Electronic watermark embedding method, electronic watermark embedding apparatus, program, and computer-readable recording medium
CN103489170B (en) * 2013-09-05 2017-01-11 Zhejiang Uniview Technologies Co Ltd Method and device for JPEG picture synthesis and OSD information superimposition
CN113658085B (en) * 2021-10-20 2022-02-01 Beijing Youmu Technology Co ltd Image processing method and device

Also Published As

Publication number Publication date
WO2023065604A1 (en) 2023-04-27
CN113658085A (en) 2021-11-16

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant