CN110853084A - Image adaptation method based on edge vision protection - Google Patents


Info

Publication number
CN110853084A
CN110853084A (application CN201911122160.2A); granted as CN110853084B
Authority
CN
China
Prior art keywords
point, energy, line, edge, map
Prior art date
Legal status
Granted
Application number
CN201911122160.2A
Other languages
Chinese (zh)
Other versions
CN110853084B (en)
Inventor
刘庆芳
刘海云
王月春
李冀东
刘涛
Current Assignee
Shijiazhuang Vocational And Technical College Of Posts And Telecommunications (training Center Of China Post Group Company)
Original Assignee
Shijiazhuang Vocational And Technical College Of Posts And Telecommunications (training Center Of China Post Group Company)
Priority date
Filing date
Publication date
Application filed by Shijiazhuang Vocational And Technical College Of Posts And Telecommunications (training Center Of China Post Group Company)
Priority to CN201911122160.2A
Publication of CN110853084A
Application granted
Publication of CN110853084B
Legal status: Active

Classifications

    All classifications fall under G (Physics), G06 (Computing; calculating or counting), G06T (image data processing or generation, in general):
    • G06T7/33 Determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • G06T5/70 Denoising; smoothing
    • G06T7/0012 Biomedical image inspection
    • G06T7/11 Region-based segmentation
    • G06T7/13 Edge detection
    • G06T7/181 Segmentation or edge detection involving edge growing or edge linking
    • G06T2207/20024 Filtering details
    • G06T2207/20132 Image cropping
    • G06T2207/30004 Biomedical image processing


Abstract

The invention discloses an image adaptation method based on edge vision protection, comprising the following steps. S1, importance map calculation: compute a weighted importance map from the gradient map, the saliency map and the edge lines. S2, energy map optimization: take the importance map as the basic energy map and optimize it using the visual characteristics of the inclination and curvature of lines. S3, seam carving adaptation: perform forward seam carving according to the optimized energy map to realize image adaptation. While preserving the display quality of salient objects, the invention keeps the shape of edge contours in non-salient regions and thereby improves the continuity and smoothness of edges and contours.

Description

Image adaptation method based on edge vision protection
Technical Field
The invention relates to an image adaptation method, in particular to an image adaptation method based on edge vision protection.
Background
Currently, with the rapid development of computer-aided diagnosis, a variety of medical imaging technologies are used clinically to obtain images of the internal tissues and organs of the human body, such as computed tomography (CT), nuclear magnetic resonance imaging (MRI) and medical ultrasound imaging (UI). However, the resolution of these medical images is fixed or set only once, so images shown on different display devices (computer screens, projectors, notebook computers, tablet computers, smartphones, etc.) must be adjusted to match display attributes such as resolution and aspect ratio. This adjustment process is also referred to as image retargeting or image display adaptation. The traditional adaptation methods are mainly uniform scaling and direct cropping; they are simple and direct, but cannot achieve a satisfactory visual effect, because differences in resolution and aspect ratio before and after scaling cause deformation, distortion or loss of important image content.
To solve this problem, researchers have proposed content-aware image adaptation, which divides an image into important and unimportant parts, preferentially maintains the integrity and display scale of the important parts, and shifts deformation and distortion to the unimportant parts as much as possible. Most current studies focus only on the display quality of salient regions, while the scaling effect (distortion or deformation) and display quality of non-salient regions receive less attention. However, the human visual system is sensitive to edge contours, and excessive deformation and misalignment of contours in unimportant regions cause display defects that degrade the visual quality of the scaled image.
The traditional adaptation methods consider only the original size and the target size of the image, not its content; although efficient, their results are unsatisfactory. They comprise uniform scaling and direct cropping. (1) Uniform scaling: a uniform mapping between the pixels of the original image and the pixels of the target image is established by interpolation; the three most common interpolation methods are nearest-neighbour, bilinear and cubic interpolation. Uniform scaling is efficient and can run in real time, but it cannot preserve proportions when the aspect ratio of the image changes, causing content distortion. (2) Direct cropping: the method is simple, but important content near the image border may be cut off.
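For illustration, uniform scaling by bilinear interpolation, the second of the three interpolation methods named above, can be sketched as follows (a minimal NumPy sketch; the function name and array layout are illustrative, not from the patent):

```python
import numpy as np

def bilinear_resize(img, new_h, new_w):
    """Uniform scaling by bilinear interpolation: each target pixel maps back
    to fractional source coordinates and blends the 4 surrounding pixels."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, new_h)          # back-projected row coords
    xs = np.linspace(0, w - 1, new_w)          # back-projected column coords
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    a = img[np.ix_(y0, x0)]; b = img[np.ix_(y0, x1)]
    c = img[np.ix_(y1, x0)]; d = img[np.ix_(y1, x1)]
    top = a * (1 - wx) + b * wx                # blend horizontally
    bot = c * (1 - wx) + d * wx
    return top * (1 - wy) + bot * wy           # then vertically
```

Because every source pixel is mapped uniformly, the method is fast but, as the paragraph above notes, it distorts content whenever the aspect ratio changes.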
Content-aware adaptation has received a great deal of attention from researchers in recent years. During adaptation, the image content is divided into important and unimportant parts, and the unimportant parts are preferentially cut or stretched so that the shape of the important parts remains as unchanged as possible. The importance of image content is measured by criteria such as gradient and saliency.
Content-aware adaptation methods are mainly discrete, continuous, or multi-operator.
The main discrete method is seam carving (line cropping), which inserts or deletes the minimum-energy 8-connected pixel path in the input image to realize adaptation. Avidan and Shamir first proposed seam carving, performing image adaptation by inserting or deleting pixel paths found by minimizing an energy equation defined on the image gradient map. Rubinstein et al. used a graph-cut method to find the minimum-energy path and, by accounting for the new energy created after pixels are deleted, proposed forward seam carving, in contrast to backward carving, which considers only the energy of the pixels themselves. Wang et al. used super-resolution methods to improve image scaling quality.
Continuous adaptation methods mainly include mesh deformation and spring deformation. The main idea of mesh deformation is to partition the input image into a set of meshes and compute the importance of each mesh from the importance of the image content; during adaptation, deformation is concentrated as much as possible in mesh regions of low importance, while regions of high importance keep their shape and proportions. Different mesh shapes have been used to partition the input image, such as triangles, rectangles and curved trapezoids. Spring deformation is an image adaptation method based on a spring model: the edges of the mesh are treated as springs, a spring system is constructed, and adaptation is realized by optimizing that system.
Multi-operator adaptation combines methods such as seam carving, uniform scaling and uniform cropping in a specific order and quantity to obtain relatively better results when a single method fails; it is less efficient, however, and cannot fundamentally improve the result.
The above methods focus mainly on preserving the shape of important regions, while the image quality of unimportant regions receives less attention.
Disclosure of Invention
The invention mainly aims to provide an image adaptation method based on edge vision protection.
The technical scheme adopted by the invention is as follows: an image adaptation method based on edge vision protection comprises the following steps:
S1, importance map calculation: compute a weighted importance map from the gradient map, the saliency map and the edge lines;
S2, energy map optimization: take the importance map as the basic energy map and optimize it using the visual characteristics of the inclination and curvature of lines;
S3, seam carving adaptation: perform forward seam carving according to the optimized energy map to realize image adaptation.
Further, the step S1 includes:
Gradient calculation:
First convert the RGB color image to a gray-scale image, then compute the gradient of the input image with the Sobel operator. Let $W_h$ be the horizontal 3 × 3 operator and $W_v$ the vertical 3 × 3 operator, see equation (1):

$$W_h = \begin{pmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{pmatrix}, \qquad W_v = \begin{pmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{pmatrix} \tag{1}$$

Let $G_h$ and $G_v$ be the responses of $W_h$ and $W_v$. The gradient magnitude $G$ of the image is given by equation (2):

$$G = \sqrt{G_h^2 + G_v^2} \tag{2}$$

Because the square and square root incur a large computational overhead, the invention approximates the gradient magnitude by the sum of absolute values, see equation (3):

$$G \approx |G_h| + |G_v| \tag{3}$$
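The gradient computation of equations (1)–(3) can be sketched in Python/NumPy as follows (a minimal illustration; the function name and the edge-padding choice are not specified by the patent):

```python
import numpy as np

def sobel_gradient(gray):
    """Approximate gradient magnitude |Gh| + |Gv| (cf. Eq. (3)) using the
    3x3 Sobel operators Wh (horizontal) and Wv (vertical) of Eq. (1)."""
    Wh = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    Wv = Wh.T  # vertical operator is the transpose of the horizontal one
    H, W = gray.shape
    padded = np.pad(gray.astype(float), 1, mode="edge")
    Gh = np.zeros((H, W))
    Gv = np.zeros((H, W))
    for dy in range(3):              # accumulate each kernel cell's response
        for dx in range(3):
            win = padded[dy:dy + H, dx:dx + W]
            Gh += Wh[dy, dx] * win
            Gv += Wv[dy, dx] * win
    return np.abs(Gh) + np.abs(Gv)   # sum of absolute values, Eq. (3)
```

The absolute-value form avoids the square and square root of equation (2), matching the efficiency argument above.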
Saliency map calculation:
For medical images, the salient part is obtained by a combination of morphological methods, or the organ region of interest is segmented by region growing, probabilistic or statistical models, and graph-cut methods;
Edge line calculation:
Edge lines in the image are computed with Canny edge detection. Let $f(x, y)$ be the input image and $G(x, y, \sigma)$ a Gaussian function. The input image is first smoothed with a Gaussian filter of standard deviation $\sigma = 2$ to reduce noise; the smoothed image $f_s(x, y)$ is given by equation (4):

$$f_s(x, y) = G(x, y, \sigma) * f(x, y) \tag{4}$$

where $*$ denotes convolution. The magnitude and direction at each point are then computed as in equations (5) and (6):

$$M(x, y) = \sqrt{G_x^2 + G_y^2} \tag{5}$$

$$\theta(x, y) = \arctan\!\left(G_y / G_x\right) \tag{6}$$

where $G_x$ and $G_y$ are the partial derivatives of $f_s$ along $x$ and $y$. The edges are thinned by non-maximum suppression: only the point with maximum magnitude along the edge-normal direction $\theta$ is retained; if the magnitude $M$ of a point is less than the magnitudes of its 2 neighbours along the normal direction, the value of the point is set to zero, i.e. it is suppressed as a non-maximum edge point. Edges are then detected and linked by double-threshold processing and connectivity analysis: points above the high threshold are taken directly as edge points P, points below the low threshold are discarded, and points between the two thresholds are linked to P by 8-connectivity; points that cannot be linked to P are discarded;
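The double-threshold linking step can be sketched as follows (a simplified illustration of hysteresis thresholding on a magnitude map; non-maximum suppression is assumed to have been applied already, and the function name is illustrative):

```python
import numpy as np
from collections import deque

def hysteresis(mag, low, high):
    """Double-threshold edge linking: keep points at or above `high` as edge
    points P, then grow P through points between `low` and `high` that are
    8-connected to it; everything else is discarded."""
    strong = mag >= high
    weak = (mag >= low) & ~strong
    edges = strong.copy()
    H, W = mag.shape
    q = deque(zip(*np.nonzero(strong)))      # BFS from every strong point
    while q:
        y, x = q.popleft()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (0 <= ny < H and 0 <= nx < W
                        and weak[ny, nx] and not edges[ny, nx]):
                    edges[ny, nx] = True     # weak point joins the edge P
                    q.append((ny, nx))
    return edges
```

Weak points that never reach a strong point stay suppressed, exactly as the paragraph above prescribes.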
Weighted generation of the importance map:
The importance map, denoted $E$, is computed by weighting the gradient map, saliency map and edge line map obtained above. Let $S$ denote the binary saliency map, $G_n$ the normalized gradient map, and $L$ the binary edge-line map. $E$ is computed as in equation (7): in the salient region, $E$ takes a fixed value $w_s$; on the edge lines of the non-salient regions, $E$ takes a fixed value $w_e$; in the other regions, $E$ takes the value of the normalized gradient map $G_n$ ($w_s$ and $w_e$ are fixed constants in the algorithm of the invention):

$$E(x, y) = \begin{cases} w_s, & S(x, y) = 1 \\ w_e, & S(x, y) = 0,\ L(x, y) = 1 \\ G_n(x, y), & \text{otherwise} \end{cases} \tag{7}$$

Equation (7) can also be computed by a maximum-value formula, as shown in equation (8):

$$E(x, y) = \max\big(w_s\, S(x, y),\ w_e\, L(x, y),\ G_n(x, y)\big) \tag{8}$$

with the weight coefficients $w_s$ and $w_e$ as above.
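As an illustration, the maximum-value combination of equation (8) can be sketched as follows (the weights `w_s` and `w_e` are illustrative placeholders; the patent's exact constants are not reproduced in this text):

```python
import numpy as np

def importance_map(grad, saliency_mask, edge_mask, w_s=1.0, w_e=0.8):
    """Weighted importance map in the max form of Eq. (8): the salient-region
    weight, the non-salient edge-line weight, and the normalized gradient
    compete pointwise, and the largest value wins."""
    gmax = grad.max()
    g = grad / gmax if gmax > 0 else grad    # normalized gradient map G_n
    return np.maximum.reduce([
        w_s * saliency_mask.astype(float),   # salient region
        w_e * edge_mask.astype(float),       # edge lines of non-salient region
        g,                                   # elsewhere: normalized gradient
    ])
```

Because `max` is taken pointwise, a pixel that is both salient and on an edge line simply receives the larger of the two weights, which is consistent with the case analysis of equation (7) when `w_s >= w_e`.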
Further, the step S2 includes:
The lines of the non-salient region are treated as objects by a binary-image boundary-tracking method, and the edge lines of the non-salient region are converted into ordered sequences of point coordinates to ease subsequent processing. Let $N_L$ be the number of points of the L-th curve in the image. Lines whose $N_L$ does not exceed the threshold $N_T$ ($N_T = 20$ in the invention) do not easily attract human visual attention and have little effect on image adaptation, so the energy of their points is directly set to a low value $E_{low}$, see equation (9):

$$E(X_a, Y_a) = E_{low}, \quad \text{s.t. } N_L \le 20 \tag{9}$$
Calculating the slope of the edge:
Let $(X_a, Y_a)$, $1 \le a \le N_L$, be the coordinates of point a on the L-th edge line, where $N_L$ is the number of points on that line. The first-order difference at a point is the slope of the tangent to the curve at that point and reflects its steepness. The slope at point $(X_a, Y_a)$ is defined as:

$$k_a = \frac{Y_{a+1} - Y_{a-1}}{X_{a+1} - X_{a-1}}, \quad \text{s.t. } 2 \le a \le N_L - 1 \tag{10}$$

Because image pixels are discrete points, computing the slope and curvature directly from 1 or 2 neighbouring points causes jumps in the result and does not reflect the overall trend of the line; the slope and curvature are therefore computed by a multipoint weighting method, taking the $2i$ neighbouring points ($i = 3$) of an edge point to compute the tangent slope of the curve. The multipoint-weighted slope at point $(X_a, Y_a)$ is defined in equation (11):

$$k_a = \frac{\sum_{j=1}^{3} w_j\,(Y_{a+j} - Y_{a-j})}{\sum_{j=1}^{3} w_j\,(X_{a+j} - X_{a-j})}, \quad \text{s.t. } 4 \le a \le N_L - 3 \tag{11}$$

with the weight coefficients w1 = 3, w2 = 2, w3 = 1.
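A sketch of the multipoint-weighted slope of equation (11) follows (the exact formula image is lost from the source, so the weighted central-difference form shown here is an interpretation consistent with the surrounding text):

```python
def weighted_slope(xs, ys, a, weights=(3, 2, 1)):
    """Multipoint-weighted tangent slope at point a (cf. Eq. (11)):
    central differences over the 2*i neighbours (i = 3 by default),
    weighted w1 = 3, w2 = 2, w3 = 1 from nearest to farthest."""
    num = sum(w * (ys[a + j] - ys[a - j]) for j, w in enumerate(weights, 1))
    den = sum(w * (xs[a + j] - xs[a - j]) for j, w in enumerate(weights, 1))
    return num / den
```

Averaging over six neighbours smooths the pixel-level jumps that a 1- or 2-point difference would produce, which is the motivation stated above.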
Calculating the curvature of the edge:
The curvature reflects the degree of bending of the curve at a point. The curvature of a point on the curve is denoted $K$ and is given by equation (12), where $d\theta$ is the differential of the tangent angle of the curve and $ds$ is the differential of the arc length:

$$K = \frac{d\theta}{ds} \tag{12}$$

Based on the slope computed in the steps above, the angle $\theta_a$ between the tangent at point $(X_a, Y_a)$ and the horizontal can be computed with the arctangent function and used as the tangent-angle differential in equation (12); the Euclidean distance between the two points $(X_{a+1}, Y_{a+1})$ and $(X_a, Y_a)$ can be used as the arc-length differential. This yields the curvature approximation of equation (13):

$$K_a = \frac{\theta_{a+1} - \theta_a}{\sqrt{(X_{a+1} - X_a)^2 + (Y_{a+1} - Y_a)^2}}, \quad \text{s.t. } 4 \le a \le N_L - 4 \tag{13}$$

where $K_a$ is the curvature of the a-th point on curve L, and $(X_{a+1}, Y_{a+1})$ and $(X_a, Y_a)$ are the coordinates of point a+1 and the adjacent point a on curve L.

Since the range of the arctangent function is $(-\pi/2, \pi/2)$, the difference of slope arctangents cannot correctly reflect the angle between the two tangents when the edge bends sharply. The invention therefore uses the law of cosines to compute the angle $\theta_c$ between the segment joining the current point $(X_a, Y_a)$ to the weighted coordinates $(X_{a-}, Y_{a-})$ of its $i$ preceding neighbours and the segment joining it to the weighted coordinates $(X_{a+}, Y_{a+})$ of its $i$ following neighbours ($i = 3$), and uses $\theta_c$ to judge whether the line is excessively bent at that point. $\theta_c$ is computed as in equation (14):

$$\theta_c = \arccos\frac{b^2 + c^2 - d^2}{2bc} \tag{14}$$

where $b$ and $c$ are the distances from $(X_a, Y_a)$ to the two weighted neighbour points, $d$ is the distance between the two weighted neighbour points, and the neighbour coordinates are weighted with the coefficients w1 = 3, w2 = 2, w3 = 1.
The next step uses $\theta_c$ as the basis for judging whether the curve is excessively bent.
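The curvature approximation of equation (13) can be sketched as follows (a minimal illustration; for brevity the tangent angle is taken from a simple central difference rather than the multipoint-weighted slope):

```python
import math

def curvature(xs, ys, a):
    """Curvature at point a as tangent-angle change over arc length,
    cf. Eq. (13): (theta_{a+1} - theta_a) / dist(P_{a+1}, P_a)."""
    def angle(i):
        # tangent angle from a central difference around point i
        return math.atan2(ys[i + 1] - ys[i - 1], xs[i + 1] - xs[i - 1])
    ds = math.hypot(xs[a + 1] - xs[a], ys[a + 1] - ys[a])  # arc-length step
    return (angle(a + 1) - angle(a)) / ds
```

On a straight segment the tangent angle is constant, so the approximation correctly returns zero curvature; the law-of-cosines check of equation (14) is the separate safeguard for sharp bends, where arctangent differences become unreliable.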
Energy optimization:
In the process of computing column seams (row seams are handled in the same way), for a point on a line whose curvature (degree of bending) exceeds the threshold, the curve bends strongly near that point; if a seam passed through it, deleting the point would interrupt the line and make it jump sharply. The higher energy of such a point is therefore kept unchanged, blocking seams from passing through that region of the line and preventing large visual distortion.
Step S21: The invention judges whether the curve is excessively bent from the curvature of the line. When the curvature $|K_a|$ is greater than its threshold and the angle $\theta_c$ is less than its threshold, the curve bends strongly at the point, and its higher energy is kept unchanged; otherwise, the curve bends only slightly near the point, and the method proceeds to step S22.
Step S22: If the absolute value of the slope $|k_a|$ is greater than a threshold $T_1$, the line is steep at the point; if a seam passed through it, deleting the point would cause large visual distortion, so the high energy of the point is kept unchanged. Otherwise, the line near the point is relatively gentle, and the method proceeds to step S23.
Step S23: This step judges the gentleness of the line near the point from the slope. If $|k_a|$ is greater than a threshold $T_2$, the period $\lambda$ is set to 3 and the method proceeds to step S24 to optimize the edge energy; otherwise, the period is set to 2 and the method proceeds to step S24.
Step S24: This step optimizes the energy by lowering the energy of points at periodic intervals $\lambda$: the energy of the point $(X_a, Y_a)$ on the line and of its neighbours $(X_a + 1, Y_a)$ and $(X_a - 1, Y_a)$ is reduced, see equation (15). Let $e(X_a, Y_a)$ be the energy before optimization and $e'(X_a, Y_a)$ the energy after optimization. The gentler the line (the smaller the tangent slope $|k_a|$), the more the energy is reduced: when $|k_a| < 1/2$ the energy is reduced to 1/2 of the original, and when $|k_a| \ge 1$ the energy is unchanged, as shown in equation (15):

$$e'(X_a, Y_a) = \begin{cases} \tfrac{1}{2}\, e(X_a, Y_a), & |k_a| < \tfrac{1}{2} \\ e(X_a, Y_a), & |k_a| \ge 1 \end{cases} \tag{15}$$

To keep the shape integrity and continuity of the line, the energy along the line fluctuates periodically and uniformly, guiding the seams to pass through the line uniformly. The period $\lambda$ of the energy-optimized points changes with the tangent slope of the point, as shown in equation (16), where a is the index of the point in the edge point sequence: when $0 \le |k_a| < 1/2$, $\lambda = 2$, i.e. 1 point in every 2 on the line is optimized; when $1/2 \le |k_a| < 1$, $\lambda = 3$, i.e. 1 point in every 3 is optimized:

$$\lambda = \begin{cases} 2, & 0 \le |k_a| < \tfrac{1}{2} \\ 3, & \tfrac{1}{2} \le |k_a| < 1 \end{cases}, \qquad \text{optimize the point if } a \bmod \lambda = 0, \quad \text{s.t. } 4 \le a \le N_L - 4 \tag{16}$$
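The periodic halving of equations (15) and (16) can be sketched, in simplified form, as follows (the dict representation of the energy map and the omission of the horizontal-neighbour reduction are illustrative simplifications, not the patent's exact procedure):

```python
def optimize_line_energy(energy, points, slopes):
    """Periodically lower the energy along a gentle edge line, cf. Eqs. (15)
    and (16): lambda = 2 when |k| < 1/2, lambda = 3 when 1/2 <= |k| < 1;
    at every lambda-th point the energy is halved. Steep points (|k| >= 1)
    keep their full energy so seams are blocked there.
    `energy` is a dict {(x, y): value}; `points` and `slopes` run in parallel."""
    out = dict(energy)
    for a, ((x, y), k) in enumerate(zip(points, slopes)):
        if abs(k) >= 1:
            continue                         # steep: keep high energy unchanged
        lam = 2 if abs(k) < 0.5 else 3       # period from Eq. (16)
        if a % lam == 0:
            out[(x, y)] = energy[(x, y)] / 2  # halve, Eq. (15)
    return out
```

The alternation of full- and half-energy points creates the uniform energy fluctuation described above, which spreads the seams evenly across the contour instead of letting them cluster at one weak spot.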
Further, the step S3 includes:
Let $e(i, j)$ be the energy of point $(i, j)$ in the normalized energy map. The backward cumulative energy $M(i, j)$ is the energy of the point plus the minimum of the cumulative energies of the 3 points adjacent to it in the previous row; $M(i, j)$ is computed as:

$$M(i, j) = e(i, j) + \min\big(M(i-1, j-1),\ M(i-1, j),\ M(i-1, j+1)\big) \tag{17}$$

The forward cumulative energy adds a term $C(i, j)$ to the backward energy, where $C(i, j)$ is the energy created when pixels that were not adjacent become new neighbours after a pixel is deleted. For a seam entering point $(i, j)$ from the upper-left, directly above, or upper-right, the respective cost terms are:

$$C_L(i, j) = |I(i, j+1) - I(i, j-1)| + |I(i-1, j) - I(i, j-1)| \tag{18}$$

$$C_U(i, j) = |I(i, j+1) - I(i, j-1)|, \qquad C_R(i, j) = |I(i, j+1) - I(i, j-1)| + |I(i-1, j) - I(i, j+1)| \tag{19}$$

The cumulative energy is computed by traversing each point row by row from the second row: the energy of the point is added to the minimum cumulative energy of its three 8-connected neighbours in the previous row, down to the last row. After the cumulative energy is computed, dynamic programming backtracks from the last row, at each step selecting the pixel with the minimum cumulative energy among the three 8-connected neighbours in the row above, up to the first row.
The invention has the advantages that:
The contour-preserving seam carving method for image adaptation innovatively combines the gradient map, the saliency map and the edge map into a weighted importance map, and at the same time optimizes the energy of non-salient regions according to the slope and curvature of the contour lines, so that seams pass through contour-line regions more uniformly, reducing contour distortion and interruption. With this method, the contour integrity and continuity of non-salient regions can be maintained while protecting the visual quality of the salient regions of the image, further improving the display quality of the non-salient regions and of the whole image. Experiments on public data sets show that the proposed method is significantly better than the comparison methods.
While preserving the display quality of salient objects, the invention keeps the shape of edge contours in non-salient regions and thereby improves the continuity and smoothness of edges and contours. Experiments show that the proposed method reduces visual deviation, improves the display quality of non-salient regions and finally improves the overall display quality of the image.
In addition to the objects, features and advantages described above, other objects, features and advantages of the present invention are also provided. The present invention will be described in further detail below with reference to the drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention.
FIG. 1 is a schematic flow chart of the algorithm of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, as shown in fig. 1, an image adaptation method based on edge vision protection comprises the following steps:
S1, importance map calculation: compute a weighted importance map from the gradient map, the saliency map and the edge lines;
S2, energy map optimization: take the importance map as the basic energy map and optimize it using the visual characteristics of the inclination and curvature of lines;
S3, seam carving adaptation: perform forward seam carving according to the optimized energy map to realize image adaptation.
The step S1 includes:
gradient calculation:
firstly, converting an RGB color image into a gray-scale image, and then calculating the gradient of an input image by using a Sobel operator; let Wh be the horizontal 3 × 3 operator and Wv be the vertical 3 × 3 operator, see equation (1):
Figure 966607DEST_PATH_IMAGE001
(1)
gradient of gradient image
Figure 830658DEST_PATH_IMAGE002
Formula (2) in the formula (2):
Figure 972927DEST_PATH_IMAGE003
(2)
because of the large computational overhead of the square and square root, the present invention approximates the magnitude of the gradient by the sum of absolute values, see equation (3):
Figure 529810DEST_PATH_IMAGE004
(3) ;
and (3) calculating a saliency map:
for medical images, a significant part is obtained by using a combination of morphological methods, or an interested organ region is segmented by using a region growing, probability or statistical model set and graph cutting method;
edge line calculation:
computing edge lines in an image using Canny edge detection, smoothing the input image with a gaussian filter having a standard deviation of 2 to reduce noise, and using Canny edge detection to calculate the edge lines in the image
Figure 891783DEST_PATH_IMAGE005
Which represents the input image, is,
Figure 559525DEST_PATH_IMAGE006
representing a Gaussian function, smoothed images
Figure 290721DEST_PATH_IMAGE007
The calculation formula is as the formula (4):
Figure 18505DEST_PATH_IMAGE008
(4)
wherein denotes convolution; calculating the amplitude and direction of each point, the calculation formulas of the amplitude and direction are respectively shown as (5) and (6),
Figure 366310DEST_PATH_IMAGE009
(5)
Figure 572163DEST_PATH_IMAGE010
(6)
wherein:
Figure 689024DEST_PATH_IMAGE011
(ii) a Refining edges by non-maximum suppression, i.e. only retaining the point with maximum amplitude in the direction of edge normal, the amplitude of a certain point
Figure 587710DEST_PATH_IMAGE012
Less than near the edge normal
Figure 655767DEST_PATH_IMAGE013
Magnitude of 2 neighbors of direction
Figure 665311DEST_PATH_IMAGE012
Setting the value of the point to zero, namely, the point is used as a non-maximum edge point to be suppressed; detecting and connecting edges by using double-threshold processing and connection analysis, directly taking the points higher than a high threshold as an edge P, directly discarding the points lower than a low threshold, and processing the points between the two thresholds, connecting to the P by using an 8-way method, and discarding the points which cannot be connected to the P;
and (3) weighted generation of an importance graph:
calculating an importance map by weighting the calculated gradient map, saliency map, and edge line map, and using the importance mapFor representing, significance binary maps
Figure 706265DEST_PATH_IMAGE015
Representing, normalizing the gradient map
Figure 28662DEST_PATH_IMAGE016
Representation, line binary diagram
Figure 169793DEST_PATH_IMAGE017
Figure 136612DEST_PATH_IMAGE018
The calculation formula (2) is shown as (7); in the region of the salient region(s),take a value of
Figure 422679DEST_PATH_IMAGE019
(Algorithm of the invention
Figure 898660DEST_PATH_IMAGE020
) (ii) a At the edge lines of the non-salient regionsTake a value of
Figure 256009DEST_PATH_IMAGE021
(Algorithm of the invention
Figure 162785DEST_PATH_IMAGE022
) (ii) a Taking values in other regions as a normalized gradient map
Figure 848982DEST_PATH_IMAGE023
(Algorithm of the invention
Figure 915027DEST_PATH_IMAGE024
);
(7)
Equation (7) can also be calculated by the maximum value equation, as shown in equation (8),
Figure 254839DEST_PATH_IMAGE026
(8)
weight coefficient
Figure 947989DEST_PATH_IMAGE027
The step S1 includes:
Gradient calculation:
First, the RGB color image is converted to a gray-scale image, and the gradient of the input image is then computed with the Sobel operator. Let Wh be the horizontal 3×3 operator and Wv the vertical 3×3 operator, see equation (1):

Wh = [ -1 0 1; -2 0 2; -1 0 1 ],   Wv = [ -1 -2 -1; 0 0 0; 1 2 1 ]   (1)

With Gx = Wh ∗ f and Gy = Wv ∗ f, where f is the input image and ∗ denotes convolution, the gradient magnitude G of the image is given by equation (2):

G = sqrt(Gx² + Gy²)   (2)

Because the square and square root are computationally expensive, the invention approximates the gradient magnitude by the sum of absolute values, see equation (3):

G ≈ |Gx| + |Gy|   (3)
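The Sobel gradient with the absolute-value approximation of equation (3) can be sketched in numpy as follows (a minimal illustration; the function names and the zero-padding choice are ours, not the patent's):

```python
import numpy as np

# Standard 3x3 Sobel kernels (equation (1)): WH responds to horizontal
# intensity changes, WV to vertical ones.
WH = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
WV = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float)

def convolve2d(img, kernel):
    """Minimal same-size 2-D convolution with zero padding."""
    h, w = img.shape
    kh, kw = kernel.shape
    padded = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    flipped = kernel[::-1, ::-1]  # true convolution flips the kernel
    out = np.zeros((h, w), dtype=float)
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * flipped)
    return out

def gradient_magnitude(gray):
    """Approximate |G| by |Gx| + |Gy| (equation (3)), avoiding the
    square root of equation (2)."""
    gx = convolve2d(gray, WH)
    gy = convolve2d(gray, WV)
    return np.abs(gx) + np.abs(gy)
```

On a vertical step edge the response is concentrated on the columns adjacent to the step and vanishes in flat regions.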
Saliency map calculation:
For medical images, the salient part is obtained with a combination of morphological methods, or the organ region of interest is segmented with region growing, probabilistic or statistical model fitting, and graph-cut methods.

Edge line calculation:
The edge lines in the image are computed with Canny edge detection. The input image f is first smoothed with a Gaussian filter G of standard deviation 2 to reduce noise; the smoothed image fs is given by equation (4):

fs(x, y) = G(x, y) ∗ f(x, y)   (4)

where ∗ denotes convolution. The amplitude and direction of each point are then computed, see equations (5) and (6):

M(x, y) = sqrt(gx(x, y)² + gy(x, y)²)   (5)

θ(x, y) = arctan(gy(x, y) / gx(x, y))   (6)

where gx = ∂fs/∂x and gy = ∂fs/∂y. The edges are refined by non-maximum suppression, i.e. only the point with the maximum amplitude along the edge-normal direction is retained: if the amplitude M of a point is smaller than the amplitudes of its two neighbours along the normal direction θ, the value of the point is set to zero, i.e. the point is suppressed as a non-maximum edge point. Edges are then detected and linked by double-threshold processing and connectivity analysis: points above the high threshold are taken directly as edge points P, points below the low threshold are discarded, and points between the two thresholds are linked to P by 8-connectivity; points that cannot be linked to P are discarded.
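The double-threshold detection and 8-connectivity linking step can be sketched as follows (a minimal illustration, assuming a queue-based flood fill outward from the strong-edge points; the function name is ours):

```python
import numpy as np
from collections import deque

def hysteresis_threshold(mag, low, high):
    """Double-threshold edge linking: points with magnitude >= high are
    strong edges P; points below low are discarded; in-between points
    are kept only if 8-connected (directly or transitively) to P."""
    strong = mag >= high
    weak = (mag >= low) & ~strong
    edges = strong.copy()
    q = deque(zip(*np.nonzero(strong)))
    h, w = mag.shape
    while q:
        i, j = q.popleft()
        for di in (-1, 0, 1):          # scan the 8-neighbourhood
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if 0 <= ni < h and 0 <= nj < w and weak[ni, nj] and not edges[ni, nj]:
                    edges[ni, nj] = True
                    q.append((ni, nj))
    return edges
```

A weak point adjacent to a strong one survives; an isolated weak point is discarded.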
Weighted generation of the importance map:
The importance map, denoted E, is computed by weighting the computed gradient map, saliency map and edge-line map. Let S denote the saliency binary map, G the normalized gradient map and L the line binary map. E is computed as in equation (7): in salient regions E takes the value of the weight α; on the edge lines of non-salient regions E takes the value of the weight β; in the remaining regions E takes the value of the normalized gradient map G:

E(x, y) = { α, if S(x, y) = 1; β, if L(x, y) = 1 and S(x, y) = 0; G(x, y), otherwise }   (7)

Equation (7) can also be calculated with the maximum-value equation, as shown in equation (8):

E(x, y) = max(α·S(x, y), β·L(x, y), G(x, y))   (8)

where α and β are weight coefficients.
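The maximum-value form of equation (8) is a one-line numpy operation. A minimal sketch follows; the weight values ALPHA and BETA are illustrative placeholders, since the actual values used by the invention are given only in the original figures:

```python
import numpy as np

# Hypothetical weight coefficients alpha, beta; the patent's actual
# values are not recoverable from the text.
ALPHA, BETA = 1.0, 0.8

def importance_map(saliency, lines, grad):
    """Equation (8): E = max(alpha*S, beta*L, normalized gradient),
    with S and L binary maps and grad scaled to [0, 1]."""
    g = grad.astype(float)
    g_norm = g / g.max() if g.max() > 0 else g
    return np.maximum.reduce([ALPHA * saliency, BETA * lines, g_norm])
```

Salient pixels get the weight α, non-salient line pixels get β, and everything else falls back to the normalized gradient.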
The step S2 includes:
The lines of the non-salient region are treated as objects by a binary-image boundary-tracking method, which turns the edge lines of the non-salient region into ordered sequences of point coordinates for the subsequent processing. Let NL be the number of points of the L-th curve in the image. Lines with NL not higher than the threshold NT (NT = 20 in the present invention) rarely attract human visual attention and have little effect on image adaptation, so the energy of their points is set directly to a low value, as shown in equation (9):

e′(Xa, Ya) = elow,  s.t. NL ≤ 20   (9)

Lines with more than NT points (NT = 20 in the present invention) are generally true edges; they are easily noticed by human vision and need further optimization.
Calculating the slope of the edge:
Let (Xa, Ya) be the coordinates of point a on the L-th edge line, with 1 ≤ a ≤ NL, where NL is the current number of points on the L-th edge line. The first-order difference at a point is the slope of the tangent of the curve at that point and reflects how steep the tangent is. The slope at point (Xa, Ya) is defined as:

Ka = (Ya+1 − Ya-1) / (Xa+1 − Xa-1),  s.t. 2 ≤ a ≤ NL−1   (10)

Because image pixels are discrete points, computing the slope and curvature directly from 1 or 2 adjacent points makes the result jump and fails to reflect the overall trend of the line, so a multi-point weighting method is adopted when computing the slope and the curvature: the 2·i adjacent points (i = 3) of an edge point are taken to compute the slope of the tangent of the curve. The multipoint-weighted slope at point (Xa, Ya) is defined in equation (11):

Ka = (Ya+ − Ya-) / (Xa+ − Xa-),  s.t. 4 ≤ a ≤ NL−3   (11)

where (Xa-, Ya-) and (Xa+, Ya+) are the weighted averages of the coordinates of the i preceding and i following neighbour points, with weight coefficients w1 = 3, w2 = 2, w3 = 1.
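One plausible reading of the multipoint-weighted slope of equation (11), under the assumption that (Xa-, Ya-) and (Xa+, Ya+) are weighted averages of the three preceding and three following neighbour coordinates with weights w1..w3 (the function name and this averaging form are ours):

```python
import numpy as np

W = np.array([3.0, 2.0, 1.0])  # w1, w2, w3 from the text, nearest first

def weighted_slope(xs, ys, a, i=3):
    """Slope at index a from its 2*i neighbours (equation (11)), using
    weighted coordinate averages (Xa-, Ya-) and (Xa+, Ya+).
    Valid for i <= a <= len(xs) - 1 - i."""
    w = W[:i] / W[:i].sum()
    idx_prev = [a - k for k in range(1, i + 1)]   # a-1, a-2, a-3
    idx_next = [a + k for k in range(1, i + 1)]   # a+1, a+2, a+3
    x_prev = np.dot(w, np.take(xs, idx_prev)); y_prev = np.dot(w, np.take(ys, idx_prev))
    x_next = np.dot(w, np.take(xs, idx_next)); y_next = np.dot(w, np.take(ys, idx_next))
    return (y_next - y_prev) / (x_next - x_prev)
```

On a straight line the weighted estimate reproduces the true slope exactly, while on noisy pixel data it smooths out the jumps that a 2-point difference would produce.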
Calculating the curvature of the edge:
The curvature reflects the degree of bending of the curve at a point. Denote the curvature at a point on the curve by κ; κ is defined in equation (12), where dθ is the differential of the change of the tangent angle of the curve and ds is the differential of the curve arc length:

κ = dθ / ds   (12)

Based on the slope computed in the preceding step, the angle θa between the tangent at point (Xa, Ya) and the horizontal line can be obtained with the arctangent function, θa = arctan(Ka), and used as the tangent-angle differential in the curvature formula (12); the Euclidean distance between the two points (Xa+1, Ya+1) and (Xa, Ya) can be used as the arc-length differential. This yields the curvature approximation of equation (13):

κa = (θa+1 − θa) / sqrt((Xa+1 − Xa)² + (Ya+1 − Ya)²),  s.t. 4 ≤ a ≤ NL−4   (13)

where κa is the curvature at the a-th point of curve L, and (Xa+1, Ya+1) and (Xa, Ya) are the coordinates of point a+1 and the adjacent point a on curve L.
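The curvature approximation of equation (13) can be sketched as follows, here using the simple first-order-difference slope of equation (10) to keep the sketch self-contained (the patent uses the multipoint-weighted slope); the function names are ours:

```python
import math

def central_slope(xs, ys, a):
    """First-order-difference slope (equation (10))."""
    return (ys[a + 1] - ys[a - 1]) / (xs[a + 1] - xs[a - 1])

def curvature(xs, ys, a):
    """Equation (13): kappa_a ~ (theta_{a+1} - theta_a) / ds, with
    theta = arctan(K) and ds the Euclidean distance between the
    consecutive points a and a+1."""
    theta_a = math.atan(central_slope(xs, ys, a))
    theta_a1 = math.atan(central_slope(xs, ys, a + 1))
    ds = math.hypot(xs[a + 1] - xs[a], ys[a + 1] - ys[a])
    return (theta_a1 - theta_a) / ds
```

A straight line has zero curvature everywhere, while an upward-bending curve such as y = x² gives a positive value.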
Since the value range of the arctangent function is (−π/2, π/2), the difference of the slope arctangents cannot correctly reflect the angle between the two tangents when the edge bends sharply. The invention therefore uses the law of cosines to compute the angle γ between the line segments connecting the current point (Xa, Ya) to the weighted coordinates (Xa-, Ya-) of its i (i = 3) preceding nearest-neighbour points and to the weighted coordinates (Xa+, Ya+) of its i following nearest-neighbour points, and γ is used to judge whether the line bends excessively at that point. γ is computed as in equation (14):

cos γa = (b² + c² − a′²) / (2·b·c)   (14)

where b and c are the Euclidean distances from (Xa, Ya) to (Xa-, Ya-) and to (Xa+, Ya+) respectively, a′ is the Euclidean distance between (Xa-, Ya-) and (Xa+, Ya+), and the weight coefficients are w1 = 3, w2 = 2, w3 = 1. The next step uses γa as the basis for judging whether the curve bends excessively.
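The law-of-cosines angle of equation (14) can be sketched as follows (the function name is ours, and the weighted neighbour coordinates are assumed to be precomputed and passed in):

```python
import math

def turn_angle(p_prev, p, p_next):
    """Angle gamma at point p between the segments to the weighted
    previous and next coordinates, via the law of cosines
    (equation (14)): cos gamma = (b^2 + c^2 - a^2) / (2bc)."""
    b = math.dist(p, p_prev)       # |p - p_prev|
    c = math.dist(p, p_next)       # |p - p_next|
    a = math.dist(p_prev, p_next)  # |p_prev - p_next|
    cos_g = (b * b + c * c - a * a) / (2 * b * c)
    # clamp against floating-point drift before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_g))))
```

Collinear points yield 180°, a right-angle corner yields 90°; small γ therefore signals a sharp bend.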
Energy optimization:
In the process of computing the column seams (row seams are handled the same way), for a point on a line whose curvature (degree of bending) exceeds the threshold, the curve bends strongly near that point; if a carved seam passed through it, deleting the point would interrupt the line and make it jump violently. The higher energy of such a point is therefore kept unchanged, which blocks the carved seam from passing through that region of the line and prevents larger visual distortion.

Step S21: the invention uses the curvature of the line to judge whether the curve bends excessively. When the curvature κa is greater than its threshold Tκ and the angle γa is less than its threshold Tγ, the curve bends strongly at the point and its higher energy is kept unchanged; otherwise the curve near the point bends little, and the method proceeds to step S22.

Step S22: if the absolute value of the slope |Ka| is greater than or equal to 1, the line at the point is steep; if a seam passed through the point, the seam-deletion operation would remove it and cause large visual distortion, so the high energy of the point is kept unchanged. Otherwise the line near the point bends little and is relatively gentle, and the method proceeds to step S23.

Step S23: this step judges the gentleness near the point from the slope. If the absolute value of the slope |Ka| is greater than or equal to 1/2, the period λ is set to 3 before proceeding to step S24 to optimize the edge energy; otherwise the period is set to 2 and the method proceeds to step S24 to optimize the energy.

Step S24: this step optimizes the energy by adjusting the energy level of the points: at periodic intervals λ, the energy of the point (Xa, Ya) on the line and of the adjacent points (Xa+1, Ya) and (Xa−1, Ya) is reduced, see equation (15); the energy before optimization is e(Xa, Ya) and the energy after optimization is e′(Xa, Ya). The smaller the tangent slope Ka, the gentler the line and the greater the degree of energy reduction: when Ka = 0 the energy is reduced to 1/2 of the original, and when |Ka| ≥ 1 the energy is unchanged, as shown in equation (15):

e′(Xa, Ya) = e(Xa, Ya) · (1 + |Ka|) / 2, if |Ka| < 1;   e′(Xa, Ya) = e(Xa, Ya), if |Ka| ≥ 1   (15)

To keep the shape integrity and continuity of the line, the energy on the line fluctuates periodically and uniformly, which guides the cutting seams to pass through the line evenly. The period λ of the energy-optimized points changes with the tangent slope of the point, as shown in equation (16), where a is the index of the point in the edge-point sequence: when 0 ≤ |K| < 1/2, λ = 2, i.e. every other point of the line is optimized; when 1/2 ≤ |K| < 1, λ = 3, i.e. one point out of every three is optimized:

λ = { 2, if 0 ≤ |Ka| < 1/2; 3, if 1/2 ≤ |Ka| < 1 },  applied where a mod λ = 0,  s.t. 4 ≤ a ≤ NL−4   (16)
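Steps S21–S24 reduce the energy along a line periodically. Below is a sketch of the S24 update under one reading of equations (15)–(16): a reduction factor of (1 + |K|)/2, which matches the stated endpoints (1/2 at K = 0, unchanged at |K| ≥ 1). The exact formula is given only in the original figures, and the function and parameter names are ours:

```python
import numpy as np

def optimize_line_energy(energy, points, slopes):
    """Sketch of step S24: at every lambda-th point of an edge line,
    scale the energy of the point and its two neighbours by
    (1 + |K|) / 2, with period lambda = 2 when |K| < 1/2 and
    lambda = 3 when 1/2 <= |K| < 1; points with |K| >= 1 are left
    unchanged.  `points` is the ordered (x, y) sequence of one line,
    `slopes` the tangent slope at each point."""
    out = energy.copy()
    for a, ((x, y), k) in enumerate(zip(points, slopes)):
        lam = 2 if abs(k) < 0.5 else 3        # equation (16)
        if abs(k) >= 1 or a % lam != 0:
            continue
        factor = (1 + abs(k)) / 2             # 1/2 at K=0, -> 1 as |K|->1
        for dx in (-1, 0, 1):                 # the point and its neighbours
            xi = x + dx
            if 0 <= xi < out.shape[0]:
                out[xi, y] *= factor
    return out
```

With zero slope the period is 2, so every other line point (and its neighbours) is halved while the points in between keep full energy, producing the periodic zigzag profile the text describes.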
The step S3 includes:
The principle of seam carving is to find an 8-connected path in the horizontal or vertical direction whose accumulated pixel energy is minimal. Backward seam carving considers only the energy of the pixels themselves; forward seam carving additionally considers the energy introduced by the new adjacencies that deleting the seam creates between previously non-adjacent pixels. Compared with backward carving, forward carving adapts better and produces noticeably fewer artifacts, so every carving step of the method uses the forward operation to realize image adaptation.

Since horizontal and vertical seam carving follow the same principle and a similar process, the main implementation of the carving algorithm is described below with the vertical seam as an example.

Let e(i, j) be the energy of point (i, j) in the normalized energy map. The backward accumulated energy M(i, j) is the energy of the point plus the minimum of the accumulated energies of the 3 points adjacent to it in the previous row, and is computed as:

M(i, j) = e(i, j) + min(M(i−1, j−1), M(i−1, j), M(i−1, j+1))   (17)

The forward accumulated energy adds a term C(i, j) to the backward energy, where C(i, j) is the energy created when non-adjacent pixels become new neighbours after the pixel is deleted; its calculation is:

CU(i, j) = |I(i, j+1) − I(i, j−1)|,  CL(i, j) = CU(i, j) + |I(i−1, j) − I(i, j−1)|,  CR(i, j) = CU(i, j) + |I(i−1, j) − I(i, j+1)|   (18)

M(i, j) = e(i, j) + min(M(i−1, j−1) + CL(i, j), M(i−1, j) + CU(i, j), M(i−1, j+1) + CR(i, j))   (19)

The accumulated energy is computed by traversing every point of each row starting from the second row, adding the energy of the point to the minimum accumulated energy of its 8-connected neighbours in the previous row, down to the last row. After the accumulated energy has been computed, dynamic programming backtracks from the last row, at each step finding the pixel with the minimum accumulated energy among the 8-connected neighbours in the previous row, up to the first row.
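The backward cumulative energy of equation (17) together with the dynamic-programming backtrace can be sketched as follows (a minimal illustration with a name of our choosing; the forward C(i, j) terms of equations (18)–(19) are omitted for brevity):

```python
import numpy as np

def min_vertical_seam(e):
    """Backward cumulative energy (equation (17)) plus backtrace:
    M(i,j) = e(i,j) + min of the three entries above (i,j) in the
    previous row; the seam is recovered from the minimum of the last
    row upward, one column index per row."""
    h, w = e.shape
    M = e.astype(float).copy()
    for i in range(1, h):
        prev = M[i - 1]
        left = np.r_[np.inf, prev[:-1]]    # M(i-1, j-1), inf off-grid
        right = np.r_[prev[1:], np.inf]    # M(i-1, j+1)
        M[i] += np.minimum(np.minimum(left, prev), right)
    # backtrace from the smallest cumulative energy in the last row
    seam = np.empty(h, dtype=int)
    seam[-1] = int(np.argmin(M[-1]))
    for i in range(h - 2, -1, -1):
        j = seam[i + 1]
        lo, hi = max(0, j - 1), min(w, j + 2)
        seam[i] = lo + int(np.argmin(M[i, lo:hi]))
    return seam
```

On an energy map with a single low-energy diagonal, the recovered seam follows that diagonal.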
Experimental results:
The objective evaluation results are shown in Table 2: the two average scores of the algorithm of the invention on the MSRA10K data-set images are 8.1 and 7.9 respectively, higher than the other methods. The algorithm of the invention therefore matches human visual requirements better and gives a better perceptual effect.

TABLE 2 Scoring results
(table given as an image in the original document)

Because Scaling and Cropping do not consider the importance of the image content and apply uniform scaling, they are more efficient and can scale in real time. The execution times of the other three methods are shown in Table 3, with the fastest method underlined; the conventional seam-carving method (SC) uses the least time in the experiment, and the 2 methods of the invention are comparable to each other.

TABLE 3 Time-consumption comparison of the three seam-carving methods (unit: second)
(table given as an image in the original document)

The reason is that SC only computes the gradient map and uses it as the energy map, so it takes less time; SC-IE adds the saliency-detection and edge-detection computations and takes more time; SC-RE takes the most time because it adds an energy-optimization step to SC-IE and optimizes the line regions, which increases the time consumption. The detailed time consumption is shown in Table 4.

TABLE 4 Detailed breakdown of the time consumption of the SC-RE method of the invention (unit: second)
(table given as an image in the original document)
The invention exploits the sensitivity of the human eye to contours and lines and optimizes the energy at edges and lines by improving the conventional seam-carving method. First, the energy in line regions is raised to reduce the chance that a carved seam crosses a contour-line area; then, according to the inclination and bending of the lines in unimportant areas, the energy of the line regions is given a periodic zigzag profile, so that seams pass through edges and lines more evenly. This reduces the distortion of the carved edge and line regions, avoids interruptions and dislocations, reduces the visual difference from the original image before scaling, reduces visual distortion, and finally improves the scaling quality.

The invention provides a high-quality image adaptation method that, while maintaining the adaptation quality of the salient region, focuses on preserving the contours of unimportant regions, reduces contour distortion and discontinuity, reduces artificial noise, and improves the display quality of the image.

The algorithm of the invention optimizes the adaptation of the non-salient region while keeping the shape of the salient region.

The contour-preserving image-adaptation seam-carving method innovatively combines the gradient map, the saliency map and the edge map into a weighted importance map, and simultaneously optimizes the energy of the non-salient region according to the slope and curvature of the contour lines, so that the carved seams pass through the contour-line regions more evenly and contour distortion and interruption are reduced. With this method, the contour integrity and continuity of the non-salient region can be maintained while the visual effect of the salient region is protected, further improving the display quality of the non-salient region and of the whole image. Experiments on public data sets show that the proposed method is significantly better than the comparison methods.

While guaranteeing the display of salient objects, the invention improves the continuity and smoothness of edges and contours by preserving the shape of the edge contours of non-salient regions. Experiments show that the proposed method reduces visual deviation, improves the display of the non-salient region, and finally improves the overall display of the image.

The above description covers only preferred embodiments of the invention and is not intended to limit it; any modification, equivalent replacement, improvement and the like made within the spirit and principle of the invention shall be included in its scope of protection.

Claims (4)

1. An image adaptation method based on edge vision protection, characterized by comprising the following steps:
s1, calculating an importance map: calculating a weighted importance map according to the gradient map, the saliency map and the edge line;
s2, optimizing an energy map: taking the importance map as a basic energy map, and optimizing the energy map by using the visual characteristics of the inclination degree and the bending degree of the lines on the basis;
s3, line cutting adaptation: and performing forward line cutting operation according to the optimized energy diagram to realize image adaptation.
2. The image adaptation method based on edge vision protection as claimed in claim 1, characterized in that the step S1 includes:
gradient calculation:
first, the RGB color image is converted to a gray-scale image, and the gradient of the input image is then computed with the Sobel operator; let Wh be the horizontal 3×3 operator and Wv the vertical 3×3 operator, see equation (1):

Wh = [ -1 0 1; -2 0 2; -1 0 1 ],   Wv = [ -1 -2 -1; 0 0 0; 1 2 1 ]   (1)

with Gx = Wh ∗ f and Gy = Wv ∗ f, where f is the input image and ∗ denotes convolution, the gradient magnitude G of the image is given by equation (2):

G = sqrt(Gx² + Gy²)   (2)

because the square and square root are computationally expensive, the gradient magnitude is approximated by the sum of absolute values, see equation (3):

G ≈ |Gx| + |Gy|   (3);
saliency map calculation:
for medical images, the salient part is obtained with a combination of morphological methods, or the organ region of interest is segmented with region growing, probabilistic or statistical model fitting, and graph-cut methods;

edge line calculation:
the edge lines in the image are computed with Canny edge detection; the input image f is first smoothed with a Gaussian filter G of standard deviation 2 to reduce noise, the smoothed image fs being given by equation (4):

fs(x, y) = G(x, y) ∗ f(x, y)   (4)

where ∗ denotes convolution; the amplitude and direction of each point are then computed, see equations (5) and (6):

M(x, y) = sqrt(gx(x, y)² + gy(x, y)²)   (5)

θ(x, y) = arctan(gy(x, y) / gx(x, y))   (6)

where gx = ∂fs/∂x and gy = ∂fs/∂y; the edges are refined by non-maximum suppression, i.e. only the point with the maximum amplitude along the edge-normal direction is retained: if the amplitude M of a point is smaller than the amplitudes of its two neighbours along the normal direction θ, the value of the point is set to zero, i.e. the point is suppressed as a non-maximum edge point; edges are then detected and linked by double-threshold processing and connectivity analysis: points above the high threshold are taken directly as edge points P, points below the low threshold are discarded, and points between the two thresholds are linked to P by 8-connectivity; points that cannot be linked to P are discarded;
weighted generation of the importance map:
the importance map, denoted E, is computed by weighting the computed gradient map, saliency map and edge-line map; let S denote the saliency binary map, G the normalized gradient map and L the line binary map; E is computed as in equation (7): in salient regions E takes the value of the weight α; on the edge lines of non-salient regions E takes the value of the weight β; in the remaining regions E takes the value of the normalized gradient map G:

E(x, y) = { α, if S(x, y) = 1; β, if L(x, y) = 1 and S(x, y) = 0; G(x, y), otherwise }   (7)

equation (7) can also be calculated with the maximum-value equation, as shown in equation (8):

E(x, y) = max(α·S(x, y), β·L(x, y), G(x, y))   (8)

where α and β are weight coefficients.
3. The image adaptation method based on edge vision protection as claimed in claim 1, characterized in that the step S2 includes:
the lines of the non-salient region are treated as objects by a binary-image boundary-tracking method, which turns the edge lines of the non-salient region into ordered sequences of point coordinates for the subsequent processing; let NL be the number of points of the L-th curve in the image; lines with NL not higher than the threshold NT (NT = 20 in the present invention) rarely attract human visual attention and have little effect on image adaptation, so the energy of their points is set directly to a low value, as shown in equation (9):

e′(Xa, Ya) = elow,  s.t. NL ≤ 20   (9)
calculating the slope of the edge:
let (Xa, Ya) be the coordinates of point a on the L-th edge line, with 1 ≤ a ≤ NL, where NL is the current number of points on the L-th edge line;
the first-order difference at a point is the slope of the tangent of the curve at that point and reflects how steep the tangent is;
the slope at point (Xa, Ya) is defined as:

Ka = (Ya+1 − Ya-1) / (Xa+1 − Xa-1),  s.t. 2 ≤ a ≤ NL−1   (10)

because image pixels are discrete points, computing the slope and curvature directly from 1 or 2 adjacent points makes the result jump and fails to reflect the overall trend of the line, so a multi-point weighting method is adopted when computing the slope and the curvature: the 2·i adjacent points (i = 3) of an edge point are taken to compute the slope of the tangent of the curve; the multipoint-weighted slope at point (Xa, Ya) is defined in equation (11):

Ka = (Ya+ − Ya-) / (Xa+ − Xa-),  s.t. 4 ≤ a ≤ NL−3   (11)

where (Xa-, Ya-) and (Xa+, Ya+) are the weighted averages of the coordinates of the i preceding and i following neighbour points, with weight coefficients w1 = 3, w2 = 2, w3 = 1;
calculating the curvature of the edge:
the curvature reflects the degree of bending of the curve at a point; denote the curvature at a point on the curve by κ; κ is defined in equation (12), where dθ is the differential of the change of the tangent angle of the curve and ds is the differential of the curve arc length:

κ = dθ / ds   (12)

based on the slope computed in the preceding step, the angle θa between the tangent at point (Xa, Ya) and the horizontal line can be obtained with the arctangent function, θa = arctan(Ka), and used as the tangent-angle differential in the curvature formula (12); the Euclidean distance between the two points (Xa+1, Ya+1) and (Xa, Ya) can be used as the arc-length differential; this yields the curvature approximation of equation (13):

κa = (θa+1 − θa) / sqrt((Xa+1 − Xa)² + (Ya+1 − Ya)²),  s.t. 4 ≤ a ≤ NL−4   (13)

where κa is the curvature at the a-th point of curve L, and (Xa+1, Ya+1) and (Xa, Ya) are the coordinates of point a+1 and the adjacent point a on curve L;
since the value range of the arctangent function is (−π/2, π/2), the difference of the slope arctangents cannot correctly reflect the angle between the two tangents when the edge bends sharply; the law of cosines is therefore used to compute the angle γ between the line segments connecting the current point (Xa, Ya) to the weighted coordinates (Xa-, Ya-) of its i (i = 3) preceding nearest-neighbour points and to the weighted coordinates (Xa+, Ya+) of its i following nearest-neighbour points, and γ is used to judge whether the line bends excessively at that point; γ is computed as in equation (14):

cos γa = (b² + c² − a′²) / (2·b·c)   (14)

where b and c are the Euclidean distances from (Xa, Ya) to (Xa-, Ya-) and to (Xa+, Ya+) respectively, a′ is the Euclidean distance between (Xa-, Ya-) and (Xa+, Ya+), and the weight coefficients are w1 = 3, w2 = 2, w3 = 1; the next step uses γa as the basis for judging whether the curve bends excessively;
energy optimization:
in the process of computing the column seams (row seams are handled the same way), for a point on a line whose curvature (degree of bending) exceeds the threshold, the curve bends strongly near that point; if a carved seam passed through it, deleting the point would interrupt the line and make it jump violently; the higher energy of such a point is therefore kept unchanged, which blocks the carved seam from passing through that region of the line and prevents larger visual distortion;

step S21: the curvature of the line is used to judge whether the curve bends excessively; when the curvature κa is greater than its threshold Tκ and the angle γa is less than its threshold Tγ, the curve bends strongly at the point and its higher energy is kept unchanged; otherwise the curve near the point bends little, and the method proceeds to step S22;

step S22: if the absolute value of the slope |Ka| is greater than or equal to 1, the line at the point is steep; if a seam passed through the point, the seam-deletion operation would remove it and cause large visual distortion, so the high energy of the point is kept unchanged; otherwise the line near the point bends little and is relatively gentle, and the method proceeds to step S23;

step S23: this step judges the gentleness near the point from the slope; if the absolute value of the slope |Ka| is greater than or equal to 1/2, the period λ is set to 3 before proceeding to step S24 to optimize the edge energy; otherwise the period is set to 2 and the method proceeds to step S24 to optimize the energy;

step S24: this step optimizes the energy by adjusting the energy level of the points: at periodic intervals λ, the energy of the point (Xa, Ya) on the line and of the adjacent points (Xa+1, Ya) and (Xa−1, Ya) is reduced, see equation (15); the energy before optimization is e(Xa, Ya) and the energy after optimization is e′(Xa, Ya); the smaller the tangent slope Ka, the gentler the line and the greater the degree of energy reduction: when Ka = 0 the energy is reduced to 1/2 of the original, and when |Ka| ≥ 1 the energy is unchanged, as shown in equation (15):

e′(Xa, Ya) = e(Xa, Ya) · (1 + |Ka|) / 2, if |Ka| < 1;   e′(Xa, Ya) = e(Xa, Ya), if |Ka| ≥ 1   (15)

to keep the shape integrity and continuity of the line, the energy on the line fluctuates periodically and uniformly, which guides the cutting seams to pass through the line evenly; the period λ of the energy-optimized points changes with the tangent slope of the point, as shown in equation (16), where a is the index of the point in the edge-point sequence: when 0 ≤ |K| < 1/2, λ = 2, i.e. every other point of the line is optimized; when 1/2 ≤ |K| < 1, λ = 3, i.e. one point out of every three is optimized:

λ = { 2, if 0 ≤ |Ka| < 1/2; 3, if 1/2 ≤ |Ka| < 1 },  applied where a mod λ = 0,  s.t. 4 ≤ a ≤ NL−4   (16).
4. The image adaptation method based on edge vision protection as claimed in claim 1, characterized in that the step S3 includes:
let e(i, j) be the energy of point (i, j) in the normalized energy map; the backward accumulated energy M(i, j) is the energy of the point plus the minimum of the accumulated energies of the 3 points adjacent to it in the previous row, and is computed as:

M(i, j) = e(i, j) + min(M(i−1, j−1), M(i−1, j), M(i−1, j+1))   (17)

the forward accumulated energy adds a term C(i, j) to the backward energy, where C(i, j) is the energy created when non-adjacent pixels become new neighbours after the pixel is deleted; its calculation is:

CU(i, j) = |I(i, j+1) − I(i, j−1)|,  CL(i, j) = CU(i, j) + |I(i−1, j) − I(i, j−1)|,  CR(i, j) = CU(i, j) + |I(i−1, j) − I(i, j+1)|   (18)

M(i, j) = e(i, j) + min(M(i−1, j−1) + CL(i, j), M(i−1, j) + CU(i, j), M(i−1, j+1) + CR(i, j))   (19)

the accumulated energy is computed by traversing every point of each row starting from the second row, adding the energy of the point to the minimum accumulated energy of its 8-connected neighbours in the previous row, down to the last row; after the accumulated energy has been computed, dynamic programming backtracks from the last row, at each step finding the pixel with the minimum accumulated energy among the 8-connected neighbours in the previous row, up to the first row.
CN201911122160.2A 2019-11-15 2019-11-15 Image adaptation method based on edge vision protection Active CN110853084B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911122160.2A CN110853084B (en) 2019-11-15 2019-11-15 Image adaptation method based on edge vision protection


Publications (2)

Publication Number Publication Date
CN110853084A true CN110853084A (en) 2020-02-28
CN110853084B CN110853084B (en) 2022-07-08

Family

ID=69600791

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911122160.2A Active CN110853084B (en) 2019-11-15 2019-11-15 Image adaptation method based on edge vision protection

Country Status (1)

Country Link
CN (1) CN110853084B (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150302551A1 (en) * 2012-12-17 2015-10-22 Intel Corporation Content aware video resizing
CN105488758A (en) * 2015-11-30 2016-04-13 河北工业大学 Image scaling method based on content awareness
CN109447970A (en) * 2018-10-30 2019-03-08 河北工业大学 The image reorientation method based on energy transfer and uniformly scaled


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Yan Zhang et al.: "Hybrid image retargeting using optimized seam carving and scaling", Multimed Tools Appl *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111782313A (en) * 2020-05-15 2020-10-16 北京完美知识科技有限公司 Display method, device, equipment, system and storage medium
CN111782313B (en) * 2020-05-15 2023-10-20 北京完美知识科技有限公司 Display method, device, equipment, system and storage medium

Also Published As

Publication number Publication date
CN110853084B (en) 2022-07-08

Similar Documents

Publication Publication Date Title
CN107808156B Region-of-interest extraction method
CN101105862A Adaptive adjustment method for medical image window parameters
CN103034973B Adaptive image scaling method based on bicubic interpolation
CN107767349B Image warping enhancement method
CN110211058A Data augmentation method for medical images
CN107895345A Method and apparatus for improving facial image resolution
CN1919144A Ultrasonic image enhancement and speckle suppression method
JP5158202B2 Image correction apparatus and image correction method
CN112509003B Method and system for resolving target tracking frame drift
JP5709216B2 Image processing program, method and apparatus
CN107330860A Rational interpolation zoom method based on CT image edges
JP2008167027A Image processor, image processing method and image processing program
CN115908410B Pressure vessel laser welding control method based on machine vision
JP5362130B2 Image processing method and apparatus therefor
CN110853084B Image adaptation method based on edge vision protection
CN111105452A High-low resolution fusion stereo matching method based on binocular vision
CN104182965B Method for segmenting the pectoral muscle in mammographic images
Liu et al. Contour-maintaining-based image adaption for an efficient ambulance service in intelligent transportation systems
CN108510478A Lung airway image segmentation method, terminal, and storage medium
CN113689337B Ultrasonic image super-resolution reconstruction method and system based on generative adversarial networks
CN110163825A Denoising and enhancement method for human embryo cardiac ultrasound images
JP3730872B2 Image processing apparatus and image processing program
CN112070669A Super-resolution image reconstruction method for arbitrary blur kernels
CN108447066B Biliary tract image segmentation method, terminal and storage medium
CN117315735A Face super-resolution reconstruction method based on prior information and attention mechanism

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant