CN108389155B - Image processing method and device and electronic equipment - Google Patents

Image processing method and device and electronic equipment

Info

Publication number
CN108389155B
CN108389155B CN201810229530.1A
Authority
CN
China
Prior art keywords
pixel point
target
ellipse
determining
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810229530.1A
Other languages
Chinese (zh)
Other versions
CN108389155A (en)
Inventor
李艳杰
眭一帆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Qihoo Technology Co Ltd
Original Assignee
Beijing Qihoo Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Qihoo Technology Co Ltd filed Critical Beijing Qihoo Technology Co Ltd
Priority to CN201810229530.1A priority Critical patent/CN108389155B/en
Publication of CN108389155A publication Critical patent/CN108389155A/en
Application granted
Publication of CN108389155B publication Critical patent/CN108389155B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T3/04
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4023Decimation- or insertion-based scaling, e.g. pixel or line decimation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/66Analysis of geometric attributes of image moments or centre of gravity

Abstract

The invention discloses an image processing method, an image processing device and electronic equipment. The method comprises: detecting a plurality of key points corresponding to a processing object in an image, and determining a region to be processed corresponding to the processing object according to the plurality of key points; for each pixel point in the image, judging whether the pixel point belongs to the region to be processed; and if so, determining translation distance information corresponding to the pixel point according to the distance between the pixel point and the region center of the region to be processed, and translating the pixel point to a target position according to the translation distance information. In this way not all pixel points in the image need to be processed: only the pixel points in the region to be processed are translated, which reduces the amount of computation, increases the image processing speed, and achieves the effect of translating the processing object within the image.

Description

Image processing method and device and electronic equipment
Technical Field
The invention relates to the technical field of image processing, in particular to an image processing method and device and electronic equipment.
Background
With the development of computer image processing technology, image beautification has become increasingly simple and popular. It is applied in scenarios such as image post-processing, live video streaming and video recording. One example is micro-shaping special-effect processing, which processes only the pixel points in partial regions of an image; in the application scenario of applying micro-shaping special effects to a face image, it includes techniques such as face thinning, eye enlarging, nose-bridge heightening and nose-wing narrowing, which add interest and improve the aesthetics of the image. The micro-shaping special-effect processing technology in image beautification has therefore attracted ever wider attention and favour.
However, in the process of implementing the present invention, the inventors found that the above prior-art approach has at least the following problems: micro-shaping special-effect processing often processes all pixel points in an image without considering the actual deformation area, so pixel points that do not need processing are also processed, adding a large amount of extra computation and reducing the real-time performance of the processing. In addition, the prior art provides no method for translating a processing object within an image. In summary, no existing technical solution solves these problems well.
Disclosure of Invention
The present invention has been made in view of the above problems, and its object is to provide an image processing method, an image processing device and electronic equipment that overcome the above problems or at least partially solve them.
According to an aspect of the present invention, there is provided an image processing method including: detecting a plurality of key points corresponding to a processing object in an image, and determining a region to be processed corresponding to the processing object according to the plurality of key points; for each pixel point in the image, judging whether the pixel point belongs to the region to be processed; and if so, determining translation distance information corresponding to the pixel point according to the distance between the pixel point and the region center of the region to be processed, and translating the pixel point to a target position according to the translation distance information.
Optionally, the to-be-processed region is an elliptical to-be-processed region, and the elliptical to-be-processed region is determined by an ellipse center, an ellipse transverse axis and an ellipse longitudinal axis.
Optionally, the step of determining, for each pixel point in the image, whether the pixel point belongs to the region to be processed specifically includes:
determining, for each pixel point in the image, an original abscissa value and an original ordinate value of the pixel point;
scaling the original abscissa value of the pixel point according to the ratio between the length of the ellipse transverse axis and the length of the ellipse longitudinal axis, to obtain a scaled abscissa value of the pixel point;
calculating the equivalent circumferential distance between the pixel point and the ellipse center using the original ordinate value and the scaled abscissa value of the pixel point;
judging whether the equivalent circumferential distance between the pixel point and the ellipse center is greater than the length of the ellipse longitudinal axis; if not, determining that the pixel point belongs to the region to be processed.
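The membership test above can be sketched as follows. This is a minimal illustration assuming an axis-aligned ellipse and treating the axis lengths as semi-axes; the function and parameter names are ours, not the patent's:

```python
import math

def in_ellipse_region(x, y, cx, cy, a, b):
    """True if pixel (x, y) belongs to the elliptical region to be
    processed centred at (cx, cy) with transverse semi-axis a and
    longitudinal semi-axis b (illustrative names)."""
    # Scale the abscissa by b / a so the ellipse maps onto a circle
    # of radius b.
    scaled_x = (x - cx) * (b / a)
    # "Equivalent circumferential distance" from the pixel to the center,
    # computed from the scaled abscissa and the original ordinate.
    dist = math.hypot(scaled_x, y - cy)
    # The pixel belongs to the region when this distance does not
    # exceed the length of the ellipse longitudinal axis.
    return dist <= b
```

With this scaling, the single comparison against the longitudinal axis length replaces evaluating the full ellipse equation for every pixel.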
Optionally, the step of determining, for each pixel point in the image, an original abscissa value and an original ordinate value of the pixel point specifically includes:
determining the center of the ellipse as the origin of a target coordinate system in advance, determining a target transverse coordinate axis of the target coordinate system according to the ellipse transverse axis, and determining a target longitudinal coordinate axis of the target coordinate system according to the ellipse longitudinal axis;
and calculating the original abscissa value and the original ordinate value of the pixel point according to the target coordinate system.
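For the simplest case in which the ellipse axes are parallel to the image axes, the change to the target coordinate system reduces to a translation of the origin. A sketch, with illustrative names:

```python
def to_target_coords(px, py, cx, cy):
    """Map a pixel's image coordinates (px, py) into the target
    coordinate system whose origin is the ellipse center (cx, cy);
    the coordinate axes are assumed to coincide with the image axes."""
    # Original abscissa/ordinate values in the target coordinate system.
    return px - cx, py - cy
```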
Optionally, the region to be processed further includes: a target processing region and an environment processing region; the step of determining the region to be processed corresponding to the processing object according to the plurality of key points specifically includes:
determining a target processing area corresponding to the processing object contained in the image according to the contour and/or the shape of the processing object;
determining an environment processing area positioned at the periphery of the target processing area according to the target processing area;
and the step of determining translation distance information corresponding to the pixel point according to the distance between the pixel point and the area center of the area to be processed specifically includes:
when the pixel point is determined to belong to the target processing area according to the distance between the pixel point and the area center of the area to be processed, determining translation distance information corresponding to the pixel point according to a preset target processing rule;
and when the pixel point is determined to belong to the environment processing area according to the distance between the pixel point and the area center of the area to be processed, determining translation distance information corresponding to the pixel point according to a preset environment processing rule.
Optionally, the target processing region is an elliptical target processing region, and the environment processing region is an elliptical annular environment processing region located at the periphery of the elliptical target processing region;
the step of determining, according to the contour and/or the shape of the processing object, a target region shape and a target region range of the target processing region corresponding to the processing object contained in the image specifically includes:
determining a target circle center and a target transverse axis and/or a target longitudinal axis passing through the target circle center according to the plurality of key points, and determining the target region shape and the target region range of the elliptical target processing region according to the target circle center and the target transverse axis and/or target longitudinal axis passing through it;
the step of determining an environment region shape and an environment region range of an environment processing region located at the periphery of the target processing region according to the target region shape and the target region range of the target processing region specifically includes:
determining the target circle center as the ellipse center, determining the ellipse transverse axis and/or the ellipse longitudinal axis according to the target transverse axis and/or the target longitudinal axis, and determining the elliptical region to be processed according to the ellipse center, the ellipse transverse axis and the ellipse longitudinal axis; and determining the elliptical annular environment processing region located at the periphery of the elliptical target processing region according to the elliptical region to be processed and the elliptical target processing region.
Optionally, the step of determining that the pixel point belongs to the target processing region according to the distance between the pixel point and the region center of the to-be-processed region specifically includes:
judging whether the equivalent circumferential distance between the pixel point and the center of the ellipse is greater than the length of the target longitudinal axis or not; if not, determining that the pixel point belongs to the target processing area.
Optionally, the length of the transverse axis of the ellipse is a first preset multiple of the length of the target transverse axis, and/or the length of the longitudinal axis of the ellipse is a second preset multiple of the length of the target longitudinal axis; and the first preset multiple and/or the second preset multiple are/is not less than 1.
Optionally, the method further comprises:
the method comprises the steps that a first mapping relation between translation distance information corresponding to a pixel point and the equivalent circumferential distance from the pixel point to the center of an ellipse is determined in advance when the equivalent circumferential distance from the pixel point to the center of the ellipse is not larger than the length of a target longitudinal axis, and a target processing rule is determined according to the first mapping relation;
and predetermining a second mapping relation between the translation distance information corresponding to the pixel point and the equivalent circumferential distance from the pixel point to the center of the ellipse when the equivalent circumferential distance from the pixel point to the center of the ellipse is greater than the length of the target longitudinal axis and less than the length of the ellipse longitudinal axis, and determining the environment processing rule according to the second mapping relation.
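As an illustration, the two mapping relations might take the following piecewise form. The linear decay in the environment region and all names are our assumptions; the patent only requires some predetermined mapping for each region:

```python
def translation_distance(dist, target_b, ellipse_b, max_shift):
    """Map a pixel's equivalent circumferential distance `dist` to a
    translation distance, given the target longitudinal axis length
    `target_b`, the ellipse longitudinal axis length `ellipse_b`, and
    a chosen maximum shift `max_shift` (all names illustrative)."""
    # First mapping relation (target processing rule): within the
    # target region, apply the full translation distance.
    if dist <= target_b:
        return max_shift
    # Second mapping relation (environment processing rule): between the
    # target longitudinal axis and the ellipse longitudinal axis, decay
    # linearly to zero so the translated object blends into the image.
    if dist < ellipse_b:
        return max_shift * (ellipse_b - dist) / (ellipse_b - target_b)
    # Outside the region to be processed: no translation.
    return 0.0
```

Shrinking the shift to zero at the ellipse boundary is what keeps the edit seamless: pixels just outside the region are untouched, and pixels just inside move almost not at all.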
Optionally, the step of translating the pixel point to the target position according to the translation distance information specifically includes:
calculating a target coordinate value of the pixel point according to the translation distance information and the original coordinate value of the pixel point;
and determining the target position according to the target coordinate value, and translating the pixel point to the target position.
Optionally, the step of determining the target position according to the target coordinate value and translating the pixel point to the target position specifically includes:
and creating a blank image corresponding to the image in advance, and determining the target position in the blank image according to the target coordinate value.
Optionally, after the step of determining translation distance information corresponding to the pixel point according to the distance between the pixel point and the area center of the to-be-processed area, the method further includes:
determining a correction factor corresponding to the pixel point according to a preset translation correction rule, and correcting translation distance information corresponding to the pixel point according to the correction factor to obtain translation correction information;
the step of translating the pixel point to the target position according to the translation distance information specifically includes: and translating the pixel point to a target position according to the translation correction information.
Optionally, the correction factor further comprises: a lateral correction factor and a longitudinal correction factor.
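The correction step might look as follows. The multiplicative form and the names are our illustrative assumptions; the patent only states that the translation distance information is corrected by lateral and longitudinal factors:

```python
def corrected_target_coordinate(x, y, dx, dy, kx, ky):
    """Apply lateral and longitudinal correction factors (kx, ky) to the
    translation distance information (dx, dy), then compute the target
    coordinate value from the pixel's original coordinate (x, y)."""
    corrected_dx = kx * dx  # translation correction information, x component
    corrected_dy = ky * dy  # translation correction information, y component
    return x + corrected_dx, y + corrected_dy
```

Separate lateral and longitudinal factors allow, for example, a horizontal shift to be damped without affecting the vertical component.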
Optionally, the method is implemented by a graphics processor.
According to another aspect of the present invention, there is provided an image processing apparatus including: the key point detection module is suitable for detecting a plurality of key points corresponding to a processing object in the image; a to-be-processed region determining module, adapted to determine a to-be-processed region corresponding to the processing object according to the plurality of key points; the judging module is suitable for judging whether each pixel point in the image belongs to the area to be processed; the translation distance information determining module is suitable for determining translation distance information corresponding to the pixel point according to the distance between the pixel point and the area center of the area to be processed if the pixel point is located in the area to be processed; and the translation module is suitable for translating the pixel point to a target position according to the translation distance information.
Optionally, the to-be-processed region is an elliptical to-be-processed region, and the elliptical to-be-processed region is determined by an ellipse center, an ellipse transverse axis and an ellipse longitudinal axis.
Optionally, the apparatus further comprises:
the coordinate value determining module is suitable for determining an original abscissa value and an original ordinate value of each pixel point in the image;
the scaling module, adapted to scale the original abscissa value of the pixel point according to the ratio between the length of the ellipse transverse axis and the length of the ellipse longitudinal axis, obtaining a scaled abscissa value of the pixel point;
the equivalent circumferential distance calculation module, adapted to calculate the equivalent circumferential distance between the pixel point and the ellipse center using the original ordinate value and the scaled abscissa value of the pixel point;
the determination module is further adapted to: judging whether the equivalent circumferential distance between the pixel point and the center of the ellipse is greater than the length of the longitudinal axis of the ellipse; if not, determining that the pixel point belongs to the area to be processed.
Optionally, the coordinate value determination module is further adapted to:
determining the center of the ellipse as the origin of a target coordinate system in advance, determining a target transverse coordinate axis of the target coordinate system according to the ellipse transverse axis, and determining a target longitudinal coordinate axis of the target coordinate system according to the ellipse longitudinal axis;
and calculating the original abscissa value and the original ordinate value of the pixel point according to the target coordinate system.
Optionally, the region to be processed further includes: a target processing region and an environment processing region; the to-be-processed region determining module is further adapted to:
determining a target processing area corresponding to the processing object contained in the image according to the contour and/or the shape of the processing object;
determining an environment processing area positioned at the periphery of the target processing area according to the target processing area;
and, the translation distance information determination module is further adapted to:
when the pixel point is determined to belong to the target processing area according to the distance between the pixel point and the area center of the area to be processed, determining translation distance information corresponding to the pixel point according to a preset target processing rule;
and when the pixel point is determined to belong to the environment processing area according to the distance between the pixel point and the area center of the area to be processed, determining translation distance information corresponding to the pixel point according to a preset environment processing rule.
Optionally, the target processing region is an elliptical target processing region, and the environment processing region is an elliptical annular environment processing region located at the periphery of the elliptical target processing region;
the to-be-processed region determining module is further adapted to:
determining a target circle center and a target transverse axis and/or a target longitudinal axis passing through the target circle center according to the plurality of key points, and determining the target region shape and the target region range of the elliptical target processing region according to the target circle center and the target transverse axis and/or target longitudinal axis passing through it;
the to-be-processed region determining module is further adapted to:
determining the target circle center as the ellipse center, determining the ellipse transverse axis and/or the ellipse longitudinal axis according to the target transverse axis and/or the target longitudinal axis, and determining the elliptical region to be processed according to the ellipse center, the ellipse transverse axis and the ellipse longitudinal axis; and determining the elliptical annular environment processing region located at the periphery of the elliptical target processing region according to the elliptical region to be processed and the elliptical target processing region.
Optionally, the determining module is further adapted to:
judging whether the equivalent circumferential distance between the pixel point and the center of the ellipse is greater than the length of the target longitudinal axis or not; if not, determining that the pixel point belongs to the target processing area.
Optionally, the length of the transverse axis of the ellipse is a first preset multiple of the length of the target transverse axis, and/or the length of the longitudinal axis of the ellipse is a second preset multiple of the length of the target longitudinal axis; and the first preset multiple and/or the second preset multiple are/is not less than 1.
Optionally, the apparatus further comprises:
the mapping relation determining module, adapted to predetermine, for the case in which the equivalent circumferential distance from a pixel point to the ellipse center is not greater than the length of the target longitudinal axis, a first mapping relation between the translation distance information corresponding to the pixel point and that equivalent circumferential distance, the target processing rule being determined according to the first mapping relation;
the mapping relation determining module is further adapted to: predetermine, for the case in which the equivalent circumferential distance from a pixel point to the ellipse center is greater than the length of the target longitudinal axis and less than the length of the ellipse longitudinal axis, a second mapping relation between the translation distance information corresponding to the pixel point and that equivalent circumferential distance, the environment processing rule being determined according to the second mapping relation.
Optionally, the translation module is further adapted to:
calculating a target coordinate value of the pixel point according to the translation distance information and the original coordinate value of the pixel point;
and determining the target position according to the target coordinate value, and translating the pixel point to the target position.
Optionally, the translation module is further adapted to:
and creating a blank image corresponding to the image in advance, and determining the target position in the blank image according to the target coordinate value.
Optionally, the apparatus further comprises:
the correction module is suitable for determining a correction factor corresponding to the pixel point according to a preset translation correction rule, and correcting the translation distance information corresponding to the pixel point according to the correction factor to obtain translation correction information;
the translation module is further adapted to: and translating the pixel point to a target position according to the translation correction information.
Optionally, the correction factor further comprises: a lateral correction factor and a longitudinal correction factor.
According to still another aspect of the present invention, there is provided an electronic apparatus including: the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction enables the processor to execute the operation corresponding to the image processing method.
According to still another aspect of the present invention, there is provided a computer storage medium having at least one executable instruction stored therein, the executable instruction causing a processor to perform operations corresponding to the image processing method as described above.
In the image processing method, the image processing device and the electronic equipment provided by the invention, first, a plurality of key points corresponding to a processing object in an image are detected, and a region to be processed corresponding to the processing object is determined according to the key points; second, for each pixel point in the image it is judged whether the pixel point belongs to the region to be processed; finally, translation distance information corresponding to the pixel point is determined according to the distance between the pixel point and the region center of the region to be processed, and the pixel point is translated to a target position according to the translation distance information. In this way not all pixel points in the image need to be processed: only the pixel points in the region to be processed are translated, which reduces the amount of computation, increases the image processing speed, and achieves the effect of translating the processing object within the image.
The foregoing is only an overview of the technical solutions of the present invention. In order that the technical means of the present invention may be understood more clearly, and that the above and other objects, features and advantages of the present invention may be more readily apparent, embodiments of the invention are described below.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 shows a schematic flow diagram of an image processing method according to an embodiment of the invention;
FIG. 2 shows a schematic flow diagram of an image processing method according to another embodiment of the invention;
FIG. 3 is a schematic configuration diagram showing an image processing apparatus according to still another embodiment of the present invention;
FIG. 4 shows a schematic structural diagram of an electronic device according to an embodiment of the invention;
FIG. 5 shows a schematic diagram of one form of a translation distance look-up table.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Fig. 1 shows a flow diagram of an image processing method according to an embodiment of the invention. As shown in fig. 1, the method comprises the steps of:
step S110, detecting a plurality of key points corresponding to the processing object in the image, and determining a region to be processed corresponding to the processing object according to the plurality of key points.
The image may be a photograph taken by a camera or an image frame in a captured video stream, and the processing object may be a facial region, a facial feature, or the like in the image; the present invention is not limited in this respect. In practical applications, the processing objects in the image may be fixed in advance, i.e., only a plurality of fixed processing objects are handled during processing, or the user may select the processing objects in the image; the present invention does not limit this either.
In an application scenario of processing a face image, the processing object may be the face or a facial feature, and the key points may be feature points corresponding to the facial contour and/or the facial features, for example feature points along the face contour and feature points at the eyes, nose and mouth. The present invention does not limit the manner of detecting the key points.
A region to be processed corresponding to the processing object is determined in the image according to the plurality of detected key points; the region shape, region outline, region range and so on of the region to be processed can all be determined from the key points of the processing object. For example, if the processing object is determined to be an eye part in the image, the region to be processed corresponding to the eye is determined from the detected key points of the eye part.
Step S120, judging, for each pixel point in the image, whether the pixel point belongs to the region to be processed; if so, executing step S130; if not, performing no translation processing for that pixel point.
The region to be processed corresponding to the processing object was determined in step S110, and the pixel points in that region are the pixel points that need to be processed. In a specific application, a coordinate system may be established in the image, the coordinate value of each pixel point in that coordinate system calculated, and whether the pixel point belongs to the region to be processed judged from its coordinate value; alternatively, membership may be judged from the distance between the pixel point and the center of the region to be processed. The present invention does not limit the manner of judging whether a pixel point belongs to the region to be processed, and a person skilled in the art can choose it according to actual needs.
Step S130, determining translation distance information corresponding to the pixel point according to the distance between the pixel point and the area center of the area to be processed, and translating the pixel point to a target position according to the translation distance information.
In practical application, micro-shaping special-effect processing of an image usually needs to process only part of the pixel points and does not need to traverse all pixel points in the image; the pixel points that need processing are determined from the delimited region to be processed.
The region center of the region to be processed may be determined according to the region shape, the region range and the plurality of detected key points corresponding to the processing object. For example, if the shape of the region to be processed is determined to be an ellipse from a plurality of key points of the eye part, the central key point of the eye part may be determined as the region center of the elliptical region to be processed.
The distance between a pixel point and the region center of the region to be processed can be calculated from the coordinate value of the pixel point: an image coordinate system is established in the image, the coordinate values of the pixel point and of the region center in that coordinate system are calculated, and the distance between them is then computed from these coordinate values.
Translation distance information corresponding to the pixel point is determined according to the distance between the pixel point and the region center, specifically according to a mapping relationship between that distance and the translation distance information. The translation distance information may include a translation distance and a translation direction. The pixel point is then translated to a target position according to the translation distance information: a target coordinate value is determined from the translation distance information and the coordinate value of the pixel point, the target position corresponding to the target coordinate value is determined in the image, and the pixel value of the pixel point is assigned to the pixel located at the target position, completing the translation of the pixel point.
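Putting the pieces together, step S130 might be sketched as follows for a purely horizontal translation of a row-major grayscale image. This is a simplified forward mapping under the same axis-aligned assumptions as above, with names of our choosing; a real implementation would run per pixel on the GPU and interpolate pixel values:

```python
import math

def translate_region(pixels, w, h, cx, cy, a, b, shift_x):
    """Translate the pixels inside an elliptical region (center (cx, cy),
    transverse semi-axis a, longitudinal semi-axis b) by shift_x columns.
    `pixels` is a row-major list of length w * h."""
    # Start from a copy of the image (the "blank image" initialised with
    # the original content), then overwrite target positions.
    out = list(pixels)
    for y in range(h):
        for x in range(w):
            # Membership test via the equivalent circumferential distance.
            scaled_x = (x - cx) * (b / a)
            if math.hypot(scaled_x, y - cy) > b:
                continue  # outside the region to be processed: untouched
            # Target coordinate value = original coordinate + translation.
            tx = x + shift_x
            if 0 <= tx < w:
                # Assign the pixel value to the pixel at the target position.
                out[y * w + tx] = pixels[y * w + x]
    return out
```

Only pixels passing the membership test are visited for translation, which is exactly the saving the method claims over traversing and deforming every pixel in the image.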
According to the image processing method provided by this embodiment, a plurality of key points corresponding to a processing object in an image are detected, and the region to be processed corresponding to the processing object is determined according to the plurality of key points; for each pixel point in the image, it is judged whether the pixel point belongs to the region to be processed; the translation distance information corresponding to the pixel point is determined according to the distance between the pixel point and the region center of the region to be processed, and the pixel point is translated to a target position according to the translation distance information. In this way, not all pixel points in the image need to be processed; only the pixel points in the region to be processed undergo translation processing, which reduces the amount of calculation, increases the image processing speed, and still achieves the effect of translating the processing object in the image.
Fig. 2 is a schematic flow chart of an image processing method according to another embodiment of the present invention. The method of this embodiment may be implemented by a graphics processor, but may of course also be implemented in other ways, which is not limited by the present invention. As shown in fig. 2, the method comprises the following steps:
step S210, detecting a plurality of key points corresponding to a processing object in the image, and determining a to-be-processed area corresponding to the processing object according to the plurality of key points, wherein the to-be-processed area is an elliptical to-be-processed area, and the elliptical to-be-processed area is determined by an elliptical center, an elliptical horizontal axis and an elliptical vertical axis.
The image may be a photo taken by a camera or an image frame in a captured video stream, which is not limited in the present invention. In practical application, the system may pre-select the processing objects in the image, that is, only fixed processing objects are processed, or may self-select the processing objects in the image according to actual needs, that is, the number and types of the processing objects may be adjusted according to user requirements, which is not limited in the present invention.
In an application scenario for processing a face image, the processing object includes: facial regions, facial contours, and/or five-sense-organ (facial feature) sites, wherein the five-sense-organ sites include at least one of the eyebrow, eye, nose, mouth, and ear parts. The plurality of key points corresponding to the processing object include feature points corresponding to the facial regions, facial contours and/or facial features, specifically feature points corresponding to facial contour positions, feature points corresponding to the facial features, and feature points corresponding to other parts of the face. In addition, the number, distribution positions, and detection method of the key points corresponding to the processing object are not limited, and any method capable of detecting the key points falls within the scope of the present invention.
Existing micro-shaping special-effect processing technology often processes every pixel point in the image, yet micro-shaping only needs to process a partial region of the image. For example, when performing micro-shaping special-effect processing on an image containing a human face, only the pixel points in the face region need to be processed, while the pixel points outside the face region need no processing or only adaptive adjustment.
In an application scene for processing a face image, if the region to be processed is an elliptical to-be-processed region, the ellipse center, the ellipse horizontal axis, and the ellipse vertical axis are determined according to the key points of the processing object, and the elliptical processing region is then determined from them. Because the contours of the human face and the five sense organs better approximate an ellipse, setting the to-be-processed area as an elliptical processing area lets it fit the contour of the face or facial features more closely, so that the to-be-processed area contains as few pixel points as possible that do not need processing, reducing the amount of calculation and increasing the processing speed. The ellipse horizontal axis and vertical axis are perpendicular to each other, but the present invention does not limit their specific orientation, which those skilled in the art can adjust according to actual needs.
It should be noted that the elliptical to-be-processed region in this embodiment does not refer only to elliptical regions in which the ellipse horizontal-axis length differs from the ellipse vertical-axis length; since a circle is also a special ellipse, the elliptical to-be-processed region may equally be a circular to-be-processed region in which the two axis lengths are equal, as determined by the actual situation.
Optionally, the area to be treated further comprises: a target processing region and an environmental processing region; the step of determining the region to be processed corresponding to the processing object according to the plurality of key points specifically includes:
determining a target processing area corresponding to the processing object contained in the image according to the contour and/or the shape of the processing object; and determining an environment processing area positioned at the periphery of the target processing area according to the target processing area.
In practical applications, when the pixel points inside a certain region of an image are processed while the pixel points outside the region are kept in their original state, the processed image may appear unnatural and show obvious traces of change; therefore, corresponding processing also needs to be performed on some of the pixel points outside the region.
The method of this embodiment further divides the region to be processed into a target processing region and an environment processing region. Since the target processing region is determined according to the contour and/or shape of the processing object, it is the region that fits the processing object most closely; a partial region at its periphery is determined as the environment processing region, and different processing rules can then be applied to the pixel points in the target processing region and those in the environment processing region to weaken the traces of change. Colloquially, the environment processing region can be understood as a buffer zone: the pixel points in the target processing region undergo a relatively large degree of change, while the pixel points in the environment processing region are only adaptively adjusted.
Optionally, the target processing region is an elliptical target processing region, and the environment processing region is an elliptical ring-shaped environment processing region located at the periphery of the elliptical target processing region. Therefore, the to-be-processed area comprises an elliptical target processing area and an elliptical annular environment processing area.
The step of determining the target area shape and the target area range of the target processing area corresponding to the processing object included in the image according to the contour and/or the shape of the processing object specifically includes:
determining a target circle center and a target transverse axis and/or a target longitudinal axis passing through the target circle center according to the plurality of key points, and determining a target area shape and a target area range of the oval target processing area according to the target circle center and the target transverse axis and/or the target longitudinal axis passing through the target circle center;
the step of determining the environmental area shape and the environmental area range of the environmental processing area located at the periphery of the target processing area according to the target area shape and the target area range of the target processing area specifically includes:
determining the center of a target circle as the center of an ellipse, determining an ellipse transverse axis and/or an ellipse longitudinal axis according to a target transverse axis and/or a target longitudinal axis, and determining an ellipse to-be-processed area according to the center of the ellipse, the ellipse transverse axis and the ellipse longitudinal axis; and determining an elliptical annular environment processing area positioned at the periphery of the elliptical target processing area according to the elliptical to-be-processed area and the elliptical target processing area. The length of the horizontal axis of the ellipse is a first preset multiple of the length of the target horizontal axis, and/or the length of the longitudinal axis of the ellipse is a second preset multiple of the length of the target longitudinal axis; and the first preset multiple and/or the second preset multiple are/is not less than 1.
A target circle center is determined according to the plurality of key points of the processing object: specifically, a central key point corresponding to the processing object is determined as the target circle center. Then, the position, direction, and length of the target horizontal axis and of the target vertical axis are determined according to the positional relationships and distances between the target circle center and the other key points, and the area range of the target processing area is further determined according to the target circle center, the target horizontal axis, and the target vertical axis. For example, when the processing object is an eye part, the eye center key point is determined as the target circle center, the line segment connecting the eye center key point to an eye corner key point on either side of the eye is determined as the target horizontal axis, and the line segment connecting the eye center key point to an upper or lower eye boundary key point is determined as the target vertical axis. Finally, the elliptical target processing area is determined according to the target circle center, the target horizontal axis, and the target vertical axis.
Then, the elliptical to-be-processed area is determined according to the elliptical target processing area. As can be seen from the above, the elliptical to-be-processed area and the elliptical target processing area are concentric elliptical areas: specifically, the target horizontal axis can coincide with the ellipse horizontal axis and the target vertical axis with the ellipse vertical axis, the length of the ellipse horizontal axis is set to a first preset multiple of the length of the target horizontal axis, and the length of the ellipse vertical axis is set to a second preset multiple of the length of the target vertical axis. The first preset multiple and the second preset multiple may be the same, in which case the ratio of the target horizontal axis to the target vertical axis equals the ratio of the ellipse horizontal axis to the ellipse vertical axis; they may also differ, but at least one of the two values cannot be smaller than 1, that is, the area range of the elliptical to-be-processed region is larger than that of the target processing region. An elliptical ring-shaped environment processing area located at the periphery of the elliptical target processing area is then determined according to the elliptical to-be-processed area and the elliptical target processing area; that is, the elliptical ring-shaped environment area is the elliptical ring lying between the elliptical target processing area and the boundary of the elliptical to-be-processed area.
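As a concrete sketch of the eye example above, the target semi-axes can be read off the key points and the concentric outer ellipse obtained by the preset multiples. The key-point layout, parameter names, and multiple values here are illustrative assumptions:

```python
def eye_regions(center, corner, top, k1=1.5, k2=1.5):
    # center: eye center key point (the target circle center)
    # corner: an eye corner key point lying on the target horizontal axis
    # top:    an upper eye boundary key point on the target vertical axis
    # k1, k2: first/second preset multiples, each not less than 1
    cx, cy = center
    a_target = abs(corner[0] - cx)   # target horizontal semi-axis length
    b_target = abs(top[1] - cy)      # target vertical semi-axis length
    # concentric outer (to-be-processed) ellipse, scaled by the multiples
    a_outer, b_outer = k1 * a_target, k2 * b_target
    return (a_target, b_target), (a_outer, b_outer)
```

The elliptical ring between the two returned ellipses is the environment processing area.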
Step S220, determining an original abscissa value and an original ordinate value of each pixel point in the image.
In specific applications, a coordinate system can be established in the image, and the original abscissa value and original ordinate value of each pixel point in that coordinate system are calculated; specifically, a region coordinate system is established in the image according to the region to be processed, and the original abscissa value and original ordinate value of each pixel point in the region coordinate system are calculated. Optionally, when the to-be-processed area is an elliptical to-be-processed area, the ellipse center is determined in advance as the coordinate origin of the coordinate system, the horizontal coordinate axis is determined according to the ellipse horizontal axis, the vertical coordinate axis is determined according to the ellipse vertical axis, and the original abscissa value and original ordinate value of the pixel point are calculated in this coordinate system. Equivalently, the target circle center serves as the coordinate origin, the horizontal coordinate axis is determined according to the target horizontal axis, the vertical coordinate axis according to the target vertical axis, and the original abscissa value and original ordinate value of the pixel point are calculated accordingly.
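A minimal sketch of the region coordinate system described above (the helper name is an assumption):

```python
def to_region_coords(px, py, origin):
    # The target circle center (ellipse center) is taken as the coordinate
    # origin; the returned pair is the pixel point's original abscissa
    # value and original ordinate value in the region coordinate system.
    return px - origin[0], py - origin[1]
```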
Step S230, performing scaling processing on the original abscissa value of the pixel point according to the abscissa-ordinate ratio between the length of the horizontal axis of the ellipse and the length of the vertical axis of the ellipse to obtain a scaled abscissa value of the pixel point.
In the method of this embodiment, the equivalent circumferential distance between the pixel point and the ellipse center is calculated, and whether the pixel point belongs to the elliptical region to be processed is then determined according to this equivalent circumferential distance. Specifically, the abscissa-ordinate ratio of the ellipse horizontal-axis length to the ellipse vertical-axis length is calculated, the original abscissa value of the pixel point is scaled by this ratio, and the quotient of the original abscissa value and the abscissa-ordinate ratio is determined as the scaled abscissa value of the pixel point. When the first preset multiple equals the second preset multiple, the scaled abscissa value may equally be determined from the abscissa-ordinate ratio of the target horizontal-axis length to the target vertical-axis length and the original abscissa value of the pixel point, in a manner similar to the above, which is not repeated here.
In addition, when the length of the horizontal axis of the ellipse is equal to the length of the vertical axis of the ellipse, that is, the region to be processed is a circular region to be processed, the horizontal-vertical ratio between the length of the horizontal axis of the ellipse and the length of the vertical axis of the ellipse is 1.
Step S240, calculating an equivalent circumferential distance between the pixel point and the center of the ellipse by using the original ordinate and the scaled abscissa of the pixel point.
In the method of this embodiment, by calculating the equivalent circumferential distance between the pixel point and the center of the ellipse, the elliptical region to be processed, in which the length of the horizontal axis of the ellipse is different from the length of the vertical axis of the ellipse, can be equivalent to a circular region to be processed, and for each pixel point, it is determined whether the pixel point belongs to the region to be processed. In practical applications, the equivalent circumferential distance between the pixel point and the center of the ellipse can be calculated by the following formula:
D = [(x/ratio)^2 + y^2]^(1/2)
where D is the equivalent circumferential distance between the pixel point and the ellipse center, x is the original abscissa value of the pixel point, y is the original ordinate value of the pixel point, and ratio is the abscissa-ordinate ratio between the length of the ellipse horizontal axis and the length of the ellipse vertical axis; x/ratio is thus the scaled abscissa value of the pixel point.
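The formula can be exercised directly. In this sketch, `a` and `b` denote the semi-axis lengths (parameter names are assumptions), and the membership test anticipates the comparison of step S250:

```python
import math

def equivalent_distance(x, y, ratio):
    # D = [(x/ratio)^2 + y^2]^(1/2): scaling the abscissa by the
    # abscissa-ordinate ratio maps the ellipse onto an equivalent circle.
    return math.hypot(x / ratio, y)

def in_elliptical_region(x, y, a, b):
    # a: ellipse horizontal semi-axis length, b: vertical semi-axis length.
    # The pixel belongs to the region when D is not greater than b.
    return equivalent_distance(x, y, a / b) <= b
```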
Step S250, judging whether the equivalent circumferential distance between the pixel point and the center of the ellipse is larger than the length of the longitudinal axis of the ellipse, if not, executing step S260; if so, the method ends.
Judging whether the equivalent circumferential distance between the pixel point and the ellipse center is greater than the length of the ellipse vertical axis is the step of judging whether the pixel point belongs to the region to be processed. It should be noted that the equivalent circumferential distance has a corresponding relationship with the object it is compared against. Specifically, if the equivalent circumferential distance is calculated from the original ordinate value and the scaled abscissa value obtained by scaling the original abscissa value according to the abscissa-ordinate ratio between the ellipse horizontal-axis length and the ellipse vertical-axis length, this step judges whether the equivalent circumferential distance between the pixel point and the ellipse center is greater than the length of the ellipse vertical axis. If instead the equivalent circumferential distance is calculated from the original abscissa value and the scaled ordinate value obtained by scaling the original ordinate value according to the ordinate-abscissa ratio between the ellipse vertical-axis length and the ellipse horizontal-axis length, this step judges whether the equivalent circumferential distance between the pixel point and the ellipse center is greater than the length of the ellipse horizontal axis; those skilled in the art can adjust this according to actual conditions. In other words, the horizontal axis and the vertical axis in this embodiment are interchangeable, with the abscissa-ordinate ratio converted into the ordinate-abscissa ratio accordingly, and the present invention is not limited to a specific implementation.
Step S260, determining that the pixel point belongs to the to-be-processed region, determining translation distance information corresponding to the pixel point according to the equivalent circumferential distance between the pixel point and the ellipse center of the to-be-processed region, and translating the pixel point to a target position according to the translation distance information.
If the equivalent circumferential distance between the pixel point and the center of the ellipse is judged to be not more than the length of the longitudinal axis of the ellipse, the pixel point is determined to belong to the area to be processed, and the pixel point is processed according to a preset translation rule.
In this embodiment, the method of the present invention is specifically described by taking the to-be-processed area as an elliptical to-be-processed area as an example, and it can be understood by those skilled in the art that the method of the present invention is not limited to the area shape or the outline of the to-be-processed area, and accordingly, the manner of determining whether a pixel point belongs to the to-be-processed area and the manner of determining the translation distance information corresponding to the pixel point are also different depending on the area shape of the to-be-processed area. For example, for an application scenario in which the to-be-processed area is a circular to-be-processed area, determining whether a pixel belongs to the to-be-processed area specifically includes: judging whether the distance between the pixel point and the center of the circular region to be processed is larger than the length of the radius of the circular region to be processed, and if not, determining that the pixel point belongs to the region to be processed. And further determining corresponding translation distance information according to the distance between the pixel point and the center of the area circle, and performing translation processing on the pixel point according to the translation distance information.
As described in the foregoing, the determining the translation distance information corresponding to the pixel point in this embodiment further includes: and determining translation distance information corresponding to the pixel point according to the distance between the pixel point and the area center of the area to be processed.
The method specifically comprises the following steps: when the pixel point is determined to belong to a target processing area according to the distance between the pixel point and the area center of the area to be processed, translation distance information corresponding to the pixel point is determined according to a preset target processing rule; and when the pixel point is determined to belong to the environment processing area according to the distance between the pixel point and the area center of the area to be processed, determining translation distance information corresponding to the pixel point according to a preset environment processing rule.
That is, if the equivalent circumferential distance between the pixel point and the ellipse center is not greater than the length of the ellipse vertical axis, i.e., the pixel point belongs to the region to be processed, it is further determined whether the pixel point belongs to the target processing region: specifically, whether the equivalent circumferential distance between the pixel point and the target circle center is greater than the length of the target vertical axis (or target horizontal axis). If not, the pixel point is determined to belong to the target processing region and is processed according to the preset target processing rule; if so, the pixel point is determined to belong to the environment processing region and is processed according to the preset environment processing rule.
Correspondingly, when the to-be-processed area is an elliptical to-be-processed area, the step of determining whether the pixel point belongs to the target processing area according to the distance between the pixel point and the area center of the to-be-processed area specifically comprises the following steps:
judging whether the equivalent circumferential distance between the pixel point and the center of the ellipse is greater than the length of the target longitudinal axis or not; if not, determining that the pixel point belongs to the target processing area; if yes, determining that the pixel point belongs to the environment processing area.
Further, if the pixel point is judged to belong to the target processing area, determining translation distance information corresponding to the pixel point according to a preset target processing rule; and if the pixel point is judged to belong to the environment processing area, determining translation distance information corresponding to the pixel point according to a preset environment processing rule.
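The three-way decision of steps S250 to S260 can be sketched as follows (the semi-axis parameter names are assumptions):

```python
def classify_pixel(d_equiv, b_target, b_outer):
    # d_equiv:  equivalent circumferential distance to the ellipse center
    # b_target: length of the target vertical (semi-)axis
    # b_outer:  length of the ellipse vertical (semi-)axis, >= b_target
    if d_equiv > b_outer:
        return "outside"      # not in the to-be-processed area
    if d_equiv <= b_target:
        return "target"       # processed by the target processing rule
    return "environment"      # buffer ring; environment processing rule
```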
Wherein the target processing rule and the environment processing rule can be determined by the following method:
the method comprises the steps that when the equivalent circumferential distance from a pixel point to the center of an ellipse is not larger than the length of a target longitudinal axis, a first mapping relation between translation distance information corresponding to the pixel point and the equivalent circumferential distance from the pixel point to the center of the ellipse is determined in advance, and a target processing rule is determined according to the first mapping relation; and predetermining a second mapping relation between the translation distance information corresponding to the pixel point and the equivalent circumferential distance from the pixel point to the center of the ellipse when the equivalent circumferential distance from the pixel point to the center of the ellipse is greater than the length of the target longitudinal axis and less than the length of the ellipse longitudinal axis, and determining the environment processing rule according to the second mapping relation.
The first mapping relationship may be determined according to a first translation distance lookup table, whose horizontal axis represents the equivalent circumferential distance between the pixel point and the target circle center and whose vertical axis represents the translation distance of the pixel point; the first translation distance lookup table may indicate the differences between the translation distances corresponding to a plurality of pixel points having different equivalent circumferential distances to the ellipse center. The mapping rule of the first mapping relationship is as follows: when the equivalent circumferential distance from the pixel point to the target circle center is not greater than the length of the target vertical axis, the translation distance of the pixel point is first maintained at a fixed value and then gradually decreases from that fixed value as the equivalent circumferential distance grows from small to large, which ensures a smooth transition between the unprocessed area and the processed area.
The second mapping relationship may be determined according to a second translation distance lookup table, whose horizontal axis represents the equivalent circumferential distance between the pixel point and the target circle center and whose vertical axis represents the translation distance of the pixel point; the second translation distance lookup table may indicate the differences between the translation distances corresponding to a plurality of pixel points having different equivalent circumferential distances to the ellipse center. The mapping rule of the second mapping relationship is as follows: when the equivalent circumferential distance from the pixel point to the ellipse center is greater than the length of the target vertical axis and less than the length of the ellipse vertical axis, the translation distance of the pixel point gradually decreases from a fixed value as the distance between the pixel point and the target circle center grows from small to large; smooth transition between the unprocessed area and the processed area can be ensured in this way. In addition, according to the method of this embodiment, by setting processing rules corresponding to the different processing areas, the traces of change in the image can be weakened and the aesthetic appeal of the image improved. For ease of understanding, fig. 5 shows a schematic diagram of one form of translation distance lookup table. As shown in fig. 5, for a pixel point in the to-be-processed region, as the equivalent circumferential distance between the pixel point and the target circle center changes from small to large, the translation distance of the pixel point is first maintained at a fixed value and then gradually decreases from that fixed value to zero, where R may specifically refer to the length of the ellipse vertical axis or horizontal axis of the to-be-processed region. In addition, as can be understood by those skilled in the art, the purpose of setting the target processing rule and the environment processing rule in the present invention is to distinguish the target processing area from the environment processing area so that smooth transition can be better realized; therefore, the present invention does not limit the specific contents of the target processing rule and the environment processing rule (that is, the specific forms of the first mapping relationship and the second mapping relationship can be flexibly adjusted). Then, the pixel point is translated to the target position according to the translation distance information: specifically, the target coordinate value of the pixel point is calculated according to the translation distance information and the original coordinate value of the pixel point, the target position is determined according to the target coordinate value, and the pixel point is translated to the target position.
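A piecewise mapping with the shape shown in Fig. 5 — held at a fixed value inside the target region, then decreasing to zero across the environment ring — can be sketched as follows. The fixed value and the linear fall-off are illustrative assumptions; the embodiment deliberately leaves the exact form of the lookup tables open:

```python
def translation_distance(d_equiv, b_target, b_outer, fixed=5.0):
    # Inside the target region: the translation distance stays at a
    # fixed value (first mapping relationship).
    if d_equiv <= b_target:
        return fixed
    # At or beyond the boundary of the to-be-processed region: no shift.
    if d_equiv >= b_outer:
        return 0.0
    # Environment ring: fall off from the fixed value to zero
    # (second mapping relationship), giving a smooth transition.
    return fixed * (b_outer - d_equiv) / (b_outer - b_target)
```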
The translation processing of the pixel points can be performed in the original image; that is, for each pixel point in the region to be processed, the translation distance information corresponding to the pixel point is determined according to the equivalent circumferential distance between the pixel point and the ellipse center, the target position is determined according to the translation distance information and the original coordinate value of the pixel point, and the translation of the pixel point to the target position is realized by assigning the pixel value of the pixel point to the pixel point located at the target position. In addition, the translation processing of the pixel point can be realized using the original image and a newly created blank image. Specifically, the step of determining the target position according to the target coordinate value and translating the pixel point to the target position includes: creating a blank image corresponding to the original image in advance, and determining the target position in the blank image according to the target coordinate value.
This method creates a blank image corresponding to the original image and assigns pixel values to the pixel points contained in the blank image; since the target coordinate values of the pixel points are calculated from their original coordinate values, the coordinate system in the blank image needs to be consistent with the coordinate system in the original image to ensure the accuracy of the determined target position. In this way, for the pixel points in the image that do not belong to the region to be processed, the target coordinate values are consistent with the original coordinate values, and the pixel values of these pixel points are assigned directly to the pixel points located at the target positions corresponding to the target coordinate values; for the pixel points that belong to the to-be-processed region, the target position is determined according to the translation distance information corresponding to the pixel point, and the pixel value of the pixel point is assigned to the pixel point located at the target position.
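The blank-image variant described above can be sketched as follows. The `shift_for` callback, which returns (0, 0) for pixels outside the to-be-processed area, is an assumed interface for illustration:

```python
def remap_to_blank(src, shift_for):
    # src: original image as a 2-D list of pixel values.
    # shift_for(x, y) -> (dx, dy): translation for a pixel point; it is
    # (0, 0) for pixels that do not belong to the to-be-processed area,
    # so their target coordinate values equal their original ones.
    h, w = len(src), len(src[0])
    dst = [[0] * w for _ in range(h)]  # blank image, same coordinate system
    for y in range(h):
        for x in range(w):
            dx, dy = shift_for(x, y)
            tx, ty = x + dx, y + dy
            if 0 <= tx < w and 0 <= ty < h:
                dst[ty][tx] = src[y][x]  # assign pixel value at the target
    return dst
```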
Further, the translation distance information needs to adapt to various application scenarios of image processing. For example, when the processing objects are a first eye part and a second eye part, reducing the inter-eye distance requires moving the first eye part and the second eye part towards the middle, so after the translation distance of a pixel point is determined, the translation direction of the pixel point also needs to be determined. In addition, since the first translation distance lookup table and the second translation distance lookup table specify a fixed correspondence between the equivalent circumferential distance and the translation distance, in practical applications the translation distance information often needs to be corrected and dynamically adjusted according to the actual situation to achieve a better processing effect. Based on this, in this embodiment the translation distance information corresponding to the pixel point is further corrected using a correction factor.
Specifically, after the step of determining the translation distance information corresponding to the pixel point according to the distance between the pixel point and the area center of the area to be processed, the method further includes:
determining a correction factor corresponding to the pixel point according to a preset translation correction rule, and correcting translation distance information corresponding to the pixel point according to the correction factor to obtain translation correction information; the step of translating the pixel point to the target position according to the translation distance information specifically includes: and translating the pixel point to a target position according to the translation correction information. Wherein the correction factor further comprises: a lateral correction factor and a longitudinal correction factor.
The correction factor is used for determining the translation direction and/or the translation amplitude of the pixel point, and can be dynamically adjusted according to actual conditions. In a specific application, the target coordinate value of the pixel point can therefore be calculated according to the following formulas:
x = x' + x_scale * L
y = y' + y_scale * L
The above formulas show a manner of correcting the translation distance information by means of the lateral correction factor and the longitudinal correction factor and then calculating the target coordinate value from the corrected translation distance information. Here, L is the translation distance determined according to the equivalent circumferential distance. x_scale is the lateral correction factor and y_scale is the longitudinal correction factor; they respectively represent the translation amplitude of the pixel point in the lateral and longitudinal directions, and the larger the absolute value, the larger the translation amplitude. In addition, a positive x_scale indicates that the pixel point is translated in the positive lateral direction, while a negative x_scale indicates that the pixel point is translated in the negative lateral direction; the same applies to y_scale. x' is the original abscissa value of the pixel point, and y' is the original ordinate value; x is the target abscissa value of the pixel point, and y is the target ordinate value.
The target coordinate value of the pixel point is determined according to the above formulas, the target position corresponding to the target coordinate value is then determined, and the pixel value of the pixel point is assigned to the pixel point located at that target position. This is repeated until every pixel point in the region to be processed has been translated, which completes the operation of translating the processing object in the image.
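The corrected-coordinate formulas above can be expressed compactly as follows. The function name and argument order are illustrative assumptions:

```python
def corrected_target(x0, y0, L, x_scale, y_scale):
    """Apply the lateral/longitudinal correction factors to the translation
    distance L and return the target coordinate value.

    x0, y0  : original abscissa and ordinate values of the pixel point
    L       : translation distance looked up from the equivalent
              circumferential distance
    x_scale : lateral correction factor (sign gives direction,
              absolute value gives amplitude)
    y_scale : longitudinal correction factor (same convention)
    """
    x = x0 + x_scale * L
    y = y0 + y_scale * L
    return x, y
```

For example, with L = 4.0, x_scale = 0.5, and y_scale = -0.25, a pixel at (10, 20) moves in the positive lateral and negative longitudinal directions.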
Therefore, the method can achieve the effect of translating a processing object in an image. First, the region to be processed is determined according to the processing object in the image, so that the region better fits the contour or shape of the processing object; in a face image processing scene, setting the shape of the region to be processed to an ellipse makes the region match the face or the facial features more closely. Second, not every pixel point in the image needs to be processed in this manner; only the pixel points in the region to be processed are handled, which reduces the amount of calculation and improves the image processing speed. In addition, the region to be processed is divided into a target processing region and an environment processing region, and different processing rules are applied to pixel points in the different regions, which weakens visible traces of the change and improves the aesthetic quality of the image. Finally, the translation distance information corresponding to the pixel points can be dynamically adjusted by means of the correction factors, so that a better processing effect is obtained and the method can serve various application scenarios.
In addition, the inventor found in the process of implementing the invention that the deformation coefficient (that is, the translation distance information) of a pixel point in the target processing region has a first mapping relationship with the equivalent circumferential distance from the pixel point to the target circle center, while the deformation coefficient of a pixel point in the environment processing region has a second mapping relationship with that equivalent circumferential distance. Because the first mapping relationship differs from the second mapping relationship, in this embodiment the region to be processed is further divided into the target processing region and the environment processing region, and the target processing rule and the environment processing rule are set accordingly, so that the actual processing object and its peripheral region can each be processed differently, further improving the processing effect.
In addition, in practice, both the first lookup table corresponding to the first mapping relationship and the second lookup table corresponding to the second mapping relationship define the mapping between the deformation coefficient of a pixel point and the distance from the pixel point to the circle center; that is, in general a fixed mapping relationship exists between the deformation coefficient of a pixel point and the position of the pixel point relative to the circle center. However, since in this embodiment the region to be processed is an ellipse rather than a circle, a method for determining the equivalent circumferential distance is provided so that the first lookup table and the second lookup table can be used, and the position of a pixel point relative to the circle center can be determined, more conveniently and accurately. By means of the equivalent circumferential distance, the deformation coefficient of each pixel point in the elliptical processing region can be determined quickly and accurately. Moreover, when the method of this embodiment is implemented on a GPU, processing efficiency can be greatly improved because the GPU excels at parallel processing.
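The equivalent circumferential distance and the two-table selection of the deformation coefficient can be sketched as follows. The names are assumed, and the two lookup tables are stand-in functions rather than the patent's actual tables:

```python
import math

def equivalent_circumferential_distance(x, y, ellipse_a, ellipse_b):
    """Map the ellipse (transverse semi-axis ellipse_a, longitudinal
    semi-axis ellipse_b) onto a circle of radius ellipse_b by scaling the
    abscissa, then take the ordinary distance to the center.
    ratio = ellipse_a / ellipse_b, so x / ratio is the scaled abscissa."""
    ratio = ellipse_a / ellipse_b
    return math.sqrt((x / ratio) ** 2 + y ** 2)

def deformation_coefficient(d, target_b, ellipse_b, target_lut, env_lut):
    """Select the deformation coefficient (translation distance) from the
    first or second lookup table according to where the equivalent
    circumferential distance d falls:
      d <= target_b            -> target processing region (first mapping)
      target_b < d < ellipse_b -> environment ring (second mapping)
      otherwise                -> outside the region, no translation
    """
    if d <= target_b:
        return target_lut(d)
    elif d < ellipse_b:
        return env_lut(d)
    return 0.0
```

Because each pixel's coefficient depends only on its own coordinates, this computation parallelizes naturally on a GPU.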
Fig. 3 shows a schematic configuration diagram of an image processing apparatus according to still another embodiment of the present invention, as shown in fig. 3, the apparatus including:
a key point detection module 31 adapted to detect a plurality of key points corresponding to a processing object in an image;
a to-be-processed region determining module 32 adapted to determine a to-be-processed region corresponding to the processing object according to the plurality of key points;
the judging module 33 is adapted to judge, for each pixel point in the image, whether the pixel point belongs to a region to be processed;
a translation distance information determining module 34, adapted to determine translation distance information corresponding to the pixel point according to a distance between the pixel point and a region center of the region to be processed if yes;
and the translation module 35 is adapted to translate the pixel point to the target position according to the translation distance information.
Optionally, the to-be-processed region is an elliptical to-be-processed region, and the elliptical to-be-processed region is determined by an ellipse center, an ellipse horizontal axis and an ellipse vertical axis.
Optionally, the apparatus further comprises:
the coordinate value determining module is suitable for determining an original abscissa value and an original ordinate value of each pixel point in the image;
the zooming module is suitable for zooming the original abscissa value of the pixel point according to the ratio of the length of the ellipse transverse axis to the length of the ellipse longitudinal axis to obtain a zoomed abscissa value of the pixel point;
the equivalent circumferential distance calculation module is suitable for calculating the equivalent circumferential distance between the pixel point and the center of the ellipse by utilizing the original longitudinal coordinate value and the scaled horizontal coordinate value of the pixel point;
the judging module 33 is further adapted to: judging whether the equivalent circumferential distance between the pixel point and the center of the ellipse is greater than the length of the longitudinal axis of the ellipse or not; if not, determining that the pixel point belongs to the area to be processed.
Optionally, the coordinate value determination module is further adapted to:
determining the center of an ellipse as the origin of a target coordinate system in advance, determining the target transverse coordinate axis of the target coordinate system according to the ellipse transverse axis, and determining the target longitudinal coordinate axis of the target coordinate system according to the ellipse longitudinal axis;
and calculating the original abscissa value and the original ordinate value of the pixel point according to the target coordinate system.
Optionally, the area to be processed further comprises: a target processing region and an environment processing region; the to-be-processed region determining module 32 is further adapted to:
determining a target processing area corresponding to the processing object contained in the image according to the contour and/or the shape of the processing object;
determining an environment processing area positioned at the periphery of the target processing area according to the target processing area;
and, the translation distance information determination module 34 is further adapted to:
when the pixel point is determined to belong to a target processing area according to the distance between the pixel point and the area center of the area to be processed, translation distance information corresponding to the pixel point is determined according to a preset target processing rule;
and when the pixel point is determined to belong to the environment processing area according to the distance between the pixel point and the area center of the area to be processed, determining translation distance information corresponding to the pixel point according to a preset environment processing rule.
Optionally, the target processing area is an elliptical target processing area, and the environment processing area is an elliptical ring-shaped environment processing area located at the periphery of the elliptical target processing area;
the to-be-processed region determining module 32 is further adapted to:
determining a target circle center and a target transverse axis and/or a target longitudinal axis passing through the target circle center according to the plurality of key points, and determining the target area shape and target area range of the elliptical target processing area according to the target circle center and the target transverse axis and/or target longitudinal axis passing through it;
the to-be-processed region determining module 32 is further adapted to:
determining the center of a target circle as the center of an ellipse, determining an ellipse transverse axis and/or an ellipse longitudinal axis according to a target transverse axis and/or a target longitudinal axis, and determining an ellipse to-be-processed area according to the center of the ellipse, the ellipse transverse axis and the ellipse longitudinal axis; and determining an elliptical annular environment processing area positioned at the periphery of the elliptical target processing area according to the elliptical to-be-processed area and the elliptical target processing area.
Optionally, the judging module 33 is further adapted to:
judging whether the equivalent circumferential distance between the pixel point and the center of the ellipse is greater than the length of the target longitudinal axis or not; if not, determining that the pixel point belongs to the target processing area.
Optionally, the length of the horizontal axis of the ellipse is a first preset multiple of the length of the target horizontal axis, and/or the length of the longitudinal axis of the ellipse is a second preset multiple of the length of the target longitudinal axis; and the first preset multiple and/or the second preset multiple are/is not less than 1.
Optionally, the apparatus further comprises:
the mapping relation determining module is suitable for determining a first mapping relation between translation distance information corresponding to a pixel point and the equivalent circumferential distance from the pixel point to the center of the ellipse when the equivalent circumferential distance from the pixel point to the center of the ellipse is not more than the length of the target longitudinal axis in advance, and the target processing rule is determined according to the first mapping relation;
the mapping determination module is further adapted to: and predetermining a second mapping relation between the translation distance information corresponding to the pixel point and the equivalent circumferential distance from the pixel point to the center of the ellipse when the equivalent circumferential distance from the pixel point to the center of the ellipse is greater than the length of the target longitudinal axis and less than the length of the ellipse longitudinal axis, and determining the environment processing rule according to the second mapping relation.
Optionally, the translation module 35 is further adapted to:
calculating a target coordinate value of the pixel point according to the translation distance information and the original coordinate value of the pixel point;
and determining a target position according to the target coordinate value, and translating the pixel point to the target position.
Optionally, the translation module 35 is further adapted to:
and creating a blank image corresponding to the image in advance, and determining the target position in the blank image according to the target coordinate value.
Optionally, the apparatus further comprises:
the correction module is suitable for determining a correction factor corresponding to the pixel point according to a preset translation correction rule, and correcting the translation distance information corresponding to the pixel point according to the correction factor to obtain translation correction information;
the translation module 35 is further adapted to: and translating the pixel point to a target position according to the translation correction information.
Optionally, the correction factor further comprises: a lateral correction factor and a longitudinal correction factor.
The specific structure and the working principle of each module may refer to the description of the corresponding step in the method embodiment, and are not described herein again.
Yet another embodiment of the present application provides a non-volatile computer storage medium storing at least one executable instruction that can perform the image processing method in any of the above method embodiments.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, and the specific embodiment of the present invention does not limit the specific implementation of the electronic device.
As shown in fig. 4, the electronic device may include: a processor (processor)402, a Communications Interface 404, a memory 406, and a Communications bus 408.
Wherein:
the processor 402, communication interface 404, and memory 406 communicate with each other via a communication bus 408.
A communication interface 404 for communicating with network elements of other devices, such as clients or other servers.
The processor 402 is configured to execute the program 410, and may specifically perform relevant steps in the above-described embodiment of the image processing method.
In particular, program 410 may include program code comprising computer operating instructions.
The processor 402 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention. The electronic device comprises one or more processors, which may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
And a memory 406 for storing a program 410. Memory 406 may comprise high-speed RAM memory, and may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
The program 410 may specifically be configured to cause the processor 402 to perform the following operations: detecting a plurality of key points corresponding to a processing object in the image, and determining a region to be processed corresponding to the processing object according to the plurality of key points; aiming at each pixel point in the image, judging whether the pixel point belongs to a region to be processed; if so, determining translation distance information corresponding to the pixel point according to the distance between the pixel point and the area center of the area to be processed, and translating the pixel point to the target position according to the translation distance information.
In an alternative mode, the to-be-processed region is an elliptical to-be-processed region, and the elliptical to-be-processed region is determined by the center of an ellipse, the transverse axis of the ellipse and the longitudinal axis of the ellipse.
In an optional manner, the program 410 may be specifically further configured to cause the processor 402 to perform the following operations: aiming at each pixel point in the image, determining an original abscissa value and an original ordinate value of the pixel point; carrying out scaling processing on the original abscissa value of the pixel point according to the length of the ellipse transverse axis and the length of the ellipse longitudinal axis to obtain a scaled abscissa value of the pixel point; calculating the equivalent circumferential distance between the pixel point and the center of the ellipse by using the original longitudinal coordinate value and the scaled horizontal coordinate value of the pixel point; judging whether the equivalent circumferential distance between the pixel point and the center of the ellipse is greater than the length of the longitudinal axis of the ellipse or not; if not, determining that the pixel point belongs to the area to be processed.
In an optional manner, the program 410 may be specifically further configured to cause the processor 402 to perform the following operations: determining the center of an ellipse as the origin of a target coordinate system in advance, determining the target transverse coordinate axis of the target coordinate system according to the ellipse transverse axis, and determining the target longitudinal coordinate axis of the target coordinate system according to the ellipse longitudinal axis; and calculating the original abscissa value and the original ordinate value of the pixel point according to the target coordinate system.
In an alternative manner, the region to be processed further comprises a target processing area and an environment processing area, and the program 410 may be further specifically configured to cause the processor 402 to perform the following operations: determining a target processing area corresponding to the processing object contained in the image according to the contour and/or the shape of the processing object; and determining an environment processing area located at the periphery of the target processing area;
the program 410 may be further specifically configured to cause the processor 402 to perform the following operations: when the pixel point is determined to belong to a target processing area according to the distance between the pixel point and the area center of the area to be processed, translation distance information corresponding to the pixel point is determined according to a preset target processing rule; and when the pixel point is determined to belong to the environment processing area according to the distance between the pixel point and the area center of the area to be processed, determining translation distance information corresponding to the pixel point according to a preset environment processing rule.
In an optional manner, the target processing region is an elliptical target processing region, and the environment processing region is an elliptical ring-shaped environment processing region located at the periphery of the elliptical target processing region;
the program 410 may be further specifically configured to cause the processor 402 to perform the following operations: determining a target circle center and a target transverse axis and/or a target longitudinal axis passing through the target circle center according to the plurality of key points, and determining the target area shape and target area range of the elliptical target processing area according to the target circle center and the target transverse axis and/or target longitudinal axis passing through it;
the program 410 may be further specifically configured to cause the processor 402 to perform the following operations: determining the center of a target circle as the center of an ellipse, determining an ellipse transverse axis and/or an ellipse longitudinal axis according to a target transverse axis and/or a target longitudinal axis, and determining an ellipse to-be-processed area according to the center of the ellipse, the ellipse transverse axis and the ellipse longitudinal axis; and determining an elliptical annular environment processing area positioned at the periphery of the elliptical target processing area according to the elliptical to-be-processed area and the elliptical target processing area.
In an alternative manner, the program 410 may be specifically configured to cause the processor 402 to perform the following operations: judging whether the equivalent circumferential distance between the pixel point and the center of the ellipse is greater than the length of the target longitudinal axis or not; if not, determining that the pixel point belongs to the target processing area.
In an alternative mode, the length of the horizontal axis of the ellipse is a first preset multiple of the length of the target horizontal axis, and/or the length of the longitudinal axis of the ellipse is a second preset multiple of the length of the target longitudinal axis; and the first preset multiple and/or the second preset multiple are/is not less than 1.
In an alternative manner, the program 410 may be specifically configured to cause the processor 402 to perform the following operations: the method comprises the steps that when the equivalent circumferential distance from a pixel point to the center of an ellipse is not larger than the length of a target longitudinal axis, a first mapping relation between translation distance information corresponding to the pixel point and the equivalent circumferential distance from the pixel point to the center of the ellipse is determined in advance, and a target processing rule is determined according to the first mapping relation;
and predetermining a second mapping relation between the translation distance information corresponding to the pixel point and the equivalent circumferential distance from the pixel point to the center of the ellipse when the equivalent circumferential distance from the pixel point to the center of the ellipse is greater than the length of the target longitudinal axis and less than the length of the ellipse longitudinal axis, and determining the environment processing rule according to the second mapping relation.
In an alternative manner, the program 410 may be specifically configured to cause the processor 402 to perform the following operations: calculating a target coordinate value of the pixel point according to the translation distance information and the original coordinate value of the pixel point; and determining a target position according to the target coordinate value, and translating the pixel point to the target position.
In an alternative manner, the program 410 may be specifically configured to cause the processor 402 to perform the following operations: and creating a blank image corresponding to the image in advance, and determining the target position in the blank image according to the target coordinate value.
In an alternative manner, the program 410 may be specifically configured to cause the processor 402 to perform the following operations:
determining a correction factor corresponding to the pixel point according to a preset translation correction rule, and correcting translation distance information corresponding to the pixel point according to the correction factor to obtain translation correction information;
the step of translating the pixel point to the target position according to the translation distance information specifically includes: and translating the pixel point to a target position according to the translation correction information.
In an optional manner, the correction factor further comprises: a lateral correction factor and a longitudinal correction factor.
The algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose systems may also be used with the teachings herein. The required structure for constructing such a system will be apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components in an image processing apparatus according to embodiments of the present invention. The present invention may also be embodied as apparatus or device programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc., does not indicate any ordering. These words may be interpreted as names.

Claims (27)

1. An image processing method comprising:
detecting a plurality of key points corresponding to a processing object in an image, and determining a region to be processed corresponding to the processing object according to the plurality of key points;
aiming at each pixel point in the image, judging whether the pixel point belongs to the area to be processed;
if so, determining translation distance information corresponding to the pixel point according to the distance between the pixel point and the area center of the area to be processed, and translating the pixel point to a target position according to the translation distance information;
after the step of determining the translation distance information corresponding to the pixel point according to the distance between the pixel point and the area center of the area to be processed, the method further includes:
determining a correction factor corresponding to the pixel point according to a preset translation correction rule, and correcting translation distance information corresponding to the pixel point according to the correction factor to obtain translation correction information;
the step of translating the pixel point to the target position according to the translation distance information specifically includes: and translating the pixel point to a target position according to the translation correction information.
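The per-pixel flow of claim 1 (test region membership, derive a translation distance from the distance to the region center, apply a correction factor, translate) can be sketched as follows. The function name, the circular region, and the linear falloff rule are illustrative assumptions; the patent does not specify a concrete translation rule:

```python
import math

def warp_pixel(x, y, cx, cy, radius, strength=0.2, correction=1.0):
    """Illustrative sketch of claim 1: pixels inside the region are
    pulled toward the region center (cx, cy); the translation distance
    depends on the pixel's distance to the center and is scaled by a
    correction factor (claim 1's translation correction)."""
    dx, dy = x - cx, y - cy
    dist = math.hypot(dx, dy)
    if dist >= radius:            # pixel is outside the region to be processed
        return (x, y)
    # assumed rule: translation shrinks linearly toward the region edge
    factor = strength * (1.0 - dist / radius) * correction
    return (x - dx * factor, y - dy * factor)
```

A pixel outside the region is returned unchanged; a pixel inside is moved toward the center by a fraction of its offset.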
2. The method of claim 1, wherein the area to be processed is an elliptical area to be processed, and the elliptical area to be processed is determined by an ellipse center, an ellipse transverse axis and an ellipse longitudinal axis.
3. The method according to claim 2, wherein the step of determining, for each pixel point in the image, whether the pixel point belongs to the region to be processed specifically includes:
aiming at each pixel point in the image, determining an original abscissa value and an original ordinate value of the pixel point;
carrying out scaling processing on the original abscissa value of the pixel point according to the length of the ellipse transverse axis and the length of the ellipse longitudinal axis to obtain a scaled abscissa value of the pixel point;
calculating the equivalent circumferential distance between the pixel point and the center of the ellipse by using the original ordinate value and the scaled abscissa value of the pixel point;
judging whether the equivalent circumferential distance between the pixel point and the center of the ellipse is greater than the length of the longitudinal axis of the ellipse; if not, determining that the pixel point belongs to the area to be processed;
wherein the equivalent circumferential distance between the pixel point and the center of the ellipse can be calculated by the following formula: D = [(x/ratio)^2 + y^2]^(1/2)
Wherein D is the equivalent circumferential distance between the pixel point and the center of the ellipse, x is the original abscissa value of the pixel point, y is the original ordinate value of the pixel point, and ratio is the ratio of the length of the horizontal axis of the ellipse to the length of the vertical axis of the ellipse, so that x/ratio is the scaled abscissa value of the pixel point.
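The membership test of claim 3 can be sketched directly from the formula: scale the abscissa by ratio = a/b, compute the equivalent circumferential distance, and compare it against the longitudinal-axis length. The function name is illustrative, and the axis lengths are treated here as semi-axis lengths (so that D ≤ b reproduces the standard ellipse equation):

```python
import math

def in_elliptical_region(x, y, a, b):
    """Claim 3's test: (x, y) are coordinates relative to the ellipse
    center, a and b the transverse and longitudinal axis lengths.
    D = [(x/ratio)^2 + y^2]^(1/2) with ratio = a/b; the point belongs
    to the region to be processed when D is not greater than b."""
    ratio = a / b                   # transverse-axis / longitudinal-axis
    d = math.hypot(x / ratio, y)    # equivalent circumferential distance
    return d <= b
```

For a = 4, b = 2, the axis endpoints (4, 0) and (0, 2) both give D = 2 and lie in the region, while (4, 1) gives D = √5 > 2 and lies outside.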
4. The method according to claim 3, wherein the step of determining, for each pixel point in the image, an original abscissa value and an original ordinate value of the pixel point specifically includes:
determining the center of the ellipse as the origin of a target coordinate system in advance, determining a target transverse coordinate axis of the target coordinate system according to the ellipse transverse axis, and determining a target longitudinal coordinate axis of the target coordinate system according to the ellipse longitudinal axis;
and calculating the original abscissa value and the original ordinate value of the pixel point according to the target coordinate system.
5. The method according to any one of claims 2-4, wherein the area to be processed further comprises: a target processing region and an environmental processing region; the step of determining the region to be processed corresponding to the processing object according to the plurality of key points specifically includes:
determining a target processing area corresponding to the processing object contained in the image according to the contour and/or the shape of the processing object;
determining an environment processing area positioned at the periphery of the target processing area according to the target processing area;
and the step of determining translation distance information corresponding to the pixel point according to the distance between the pixel point and the area center of the area to be processed specifically includes:
when the pixel point is determined to belong to the target processing area according to the distance between the pixel point and the area center of the area to be processed, determining translation distance information corresponding to the pixel point according to a preset target processing rule;
and when the pixel point is determined to belong to the environment processing area according to the distance between the pixel point and the area center of the area to be processed, determining translation distance information corresponding to the pixel point according to a preset environment processing rule.
6. The method of claim 5, wherein the target processing region is an elliptical target processing region and the environmental processing region is an elliptical ring-shaped environmental processing region located at a periphery of the elliptical target processing region;
the step of determining a target area shape and a target area range of a target processing area corresponding to the processing object, which are included in the image, according to the contour and/or the shape of the processing object specifically includes:
determining a target circle center and a target transverse axis and/or a target longitudinal axis passing through the target circle center according to the plurality of key points, and determining a target area shape and a target area range of the elliptical target processing area according to the target circle center and the target transverse axis and/or the target longitudinal axis passing through the target circle center;
the step of determining an environment region shape and an environment region range of an environment processing region located at the periphery of the target processing region according to the target region shape and the target region range of the target processing region specifically includes:
determining the target circle center as the ellipse circle center, determining the ellipse transverse axis and/or the ellipse longitudinal axis according to the target transverse axis and/or the target longitudinal axis, and determining the ellipse to-be-processed area according to the ellipse circle center, the ellipse transverse axis and the ellipse longitudinal axis; and determining an elliptical annular environment processing area positioned at the periphery of the elliptical target processing area according to the elliptical to-be-processed area and the elliptical target processing area.
7. The method according to claim 6, wherein the step of determining that the pixel point belongs to the target processing region according to the distance between the pixel point and the region center of the to-be-processed region specifically comprises:
judging whether the equivalent circumferential distance between the pixel point and the center of the ellipse is greater than the length of the target longitudinal axis; if not, determining that the pixel point belongs to the target processing area.
8. The method of claim 7, wherein the length of the transverse axis of the ellipse is a first preset multiple of the length of the target transverse axis and/or the length of the longitudinal axis of the ellipse is a second preset multiple of the length of the target longitudinal axis; and the first preset multiple and/or the second preset multiple are/is not less than 1.
9. The method of claim 8, wherein the method further comprises:
the method comprises the steps that a first mapping relation between translation distance information corresponding to a pixel point and the equivalent circumferential distance from the pixel point to the center of an ellipse is determined in advance when the equivalent circumferential distance from the pixel point to the center of the ellipse is not larger than the length of a target longitudinal axis, and a target processing rule is determined according to the first mapping relation;
and predetermining a second mapping relation between the translation distance information corresponding to the pixel point and the equivalent circumferential distance from the pixel point to the center of the ellipse when the equivalent circumferential distance from the pixel point to the center of the ellipse is greater than the length of the target longitudinal axis and less than the length of the ellipse longitudinal axis, and determining the environment processing rule according to the second mapping relation.
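The two mapping relations of claim 9 can be sketched as a piecewise function of the equivalent circumferential distance d: one mapping inside the target region (d ≤ target axis length) and one on the elliptical ring (target axis length < d < ellipse axis length). The linear shapes and the `peak` parameter are assumptions; the patent only requires two separate predetermined mappings:

```python
def translation_distance(d, target_b, ellipse_b, peak=3.0):
    """Illustrative piecewise mapping for claim 9: a target processing
    rule for d <= target_b and an environment processing rule for
    target_b < d < ellipse_b, chosen so the two mappings agree at
    d = target_b and the translation vanishes at the outer edge."""
    if d <= target_b:                  # first mapping: target processing rule
        return peak * d / target_b
    if d < ellipse_b:                  # second mapping: environment processing rule
        return peak * (ellipse_b - d) / (ellipse_b - target_b)
    return 0.0                         # outside the region: no translation
```

Making the environment mapping fall to zero at the ellipse boundary is what blends the warped region smoothly into the untouched surroundings.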
10. The method according to claim 9, wherein the step of translating the pixel point to the target position according to the translation distance information specifically includes:
calculating a target coordinate value of the pixel point according to the translation distance information and the original coordinate value of the pixel point;
and determining the target position according to the target coordinate value, and translating the pixel point to the target position.
11. The method according to claim 10, wherein the step of determining the target position according to the target coordinate value and translating the pixel point to the target position specifically comprises:
and creating a blank image corresponding to the image in advance, and determining the target position in the blank image according to the target coordinate value.
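Claim 11's output scheme (pre-create a blank image, then write each source pixel at its computed target position) can be sketched as a forward remap. The function name and the `target_of` callback are illustrative assumptions:

```python
import numpy as np

def remap_to_blank(image, target_of):
    """Illustrative sketch of claim 11: create a blank image
    corresponding to the input, then copy each source pixel to the
    target position computed for it; targets falling outside the
    image bounds are discarded."""
    out = np.zeros_like(image)          # blank image, same shape as input
    h, w = image.shape[:2]
    for r in range(h):
        for c in range(w):
            tr, tc = target_of(r, c)    # target coordinate of this pixel
            if 0 <= tr < h and 0 <= tc < w:
                out[tr, tc] = image[r, c]
    return out
```

In practice a backward mapping (sampling the source at the inverse-mapped position) avoids the holes that forward copying can leave, but the blank-image scheme above matches the claim as written.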
12. The method of claim 11, wherein the correction factor further comprises: a lateral correction factor and a longitudinal correction factor.
13. The method of any of claims 6-12, wherein the method is implemented by a graphics processor.
14. An image processing apparatus comprising:
the key point detection module is suitable for detecting a plurality of key points corresponding to a processing object in the image;
a to-be-processed region determining module, adapted to determine a to-be-processed region corresponding to the processing object according to the plurality of key points;
the judging module is suitable for judging whether each pixel point in the image belongs to the area to be processed;
the translation distance information determining module is suitable for determining translation distance information corresponding to the pixel point according to the distance between the pixel point and the area center of the area to be processed if the pixel point is located in the area to be processed;
the translation module is suitable for translating the pixel point to a target position according to the translation distance information;
wherein the apparatus further comprises:
the correction module is suitable for determining a correction factor corresponding to the pixel point according to a preset translation correction rule, and correcting the translation distance information corresponding to the pixel point according to the correction factor to obtain translation correction information;
the translation module is further adapted to: and translating the pixel point to a target position according to the translation correction information.
15. The apparatus of claim 14, wherein the area to be processed is an elliptical area to be processed, and the elliptical area to be processed is determined by an ellipse center, an ellipse transverse axis, and an ellipse longitudinal axis.
16. The apparatus of claim 15, wherein the apparatus further comprises:
the coordinate value determining module is suitable for determining an original abscissa value and an original ordinate value of each pixel point in the image;
the zooming module is suitable for zooming the original abscissa value of the pixel point according to the ratio between the length of the ellipse transverse axis and the length of the ellipse longitudinal axis to obtain a zoomed abscissa value of the pixel point;
the equivalent circumferential distance calculation module is suitable for calculating the equivalent circumferential distance between the pixel point and the center of the ellipse by utilizing the original longitudinal coordinate value and the scaled horizontal coordinate value of the pixel point;
the determination module is further adapted to: judging whether the equivalent circumferential distance between the pixel point and the center of the ellipse is greater than the length of the longitudinal axis of the ellipse; if not, determining that the pixel point belongs to the area to be processed;
wherein the equivalent circumferential distance between the pixel point and the center of the ellipse can be calculated by the following formula: D = [(x/ratio)^2 + y^2]^(1/2)
Wherein D is the equivalent circumferential distance between the pixel point and the center of the ellipse, x is the original abscissa value of the pixel point, y is the original ordinate value of the pixel point, and ratio is the ratio of the length of the horizontal axis of the ellipse to the length of the vertical axis of the ellipse, so that x/ratio is the scaled abscissa value of the pixel point.
17. The apparatus of claim 16, wherein the coordinate value determination module is further adapted to:
determining the center of the ellipse as the origin of a target coordinate system in advance, determining a target transverse coordinate axis of the target coordinate system according to the ellipse transverse axis, and determining a target longitudinal coordinate axis of the target coordinate system according to the ellipse longitudinal axis;
and calculating the original abscissa value and the original ordinate value of the pixel point according to the target coordinate system.
18. The apparatus of any of claims 15-17, wherein the area to be processed further comprises: a target processing region and an environmental processing region; the pending area determination module is further adapted to:
determining a target processing area corresponding to the processing object contained in the image according to the contour and/or the shape of the processing object;
determining an environment processing area positioned at the periphery of the target processing area according to the target processing area;
and, the translation distance information determination module is further adapted to:
when the pixel point is determined to belong to the target processing area according to the distance between the pixel point and the area center of the area to be processed, determining translation distance information corresponding to the pixel point according to a preset target processing rule;
and when the pixel point is determined to belong to the environment processing area according to the distance between the pixel point and the area center of the area to be processed, determining translation distance information corresponding to the pixel point according to a preset environment processing rule.
19. The apparatus of claim 18, wherein the target processing region is an elliptical target processing region and the environmental processing region is an elliptical ring-shaped environmental processing region located at a periphery of the elliptical target processing region;
the pending area determination module is further adapted to:
determining a target circle center and a target transverse axis and/or a target longitudinal axis passing through the target circle center according to the plurality of key points, and determining a target area shape and a target area range of the elliptical target processing area according to the target circle center and the target transverse axis and/or the target longitudinal axis passing through the target circle center;
the pending area determination module is further adapted to:
determining the target circle center as the ellipse circle center, determining the ellipse transverse axis and/or the ellipse longitudinal axis according to the target transverse axis and/or the target longitudinal axis, and determining the ellipse to-be-processed area according to the ellipse circle center, the ellipse transverse axis and the ellipse longitudinal axis; and determining an elliptical annular environment processing area positioned at the periphery of the elliptical target processing area according to the elliptical to-be-processed area and the elliptical target processing area.
20. The apparatus of claim 19, wherein the determining module is further adapted to:
judging whether the equivalent circumferential distance between the pixel point and the center of the ellipse is greater than the length of the target longitudinal axis; if not, determining that the pixel point belongs to the target processing area.
21. The apparatus according to claim 20, wherein the length of the transverse ellipse axis is a first preset multiple of the length of the target transverse axis and/or the length of the longitudinal ellipse axis is a second preset multiple of the length of the target longitudinal axis; and the first preset multiple and/or the second preset multiple are/is not less than 1.
22. The apparatus of claim 21, wherein the apparatus further comprises:
the mapping relation determining module is suitable for determining a first mapping relation between translation distance information corresponding to a pixel point and the equivalent circumferential distance from the pixel point to the center of the ellipse when the equivalent circumferential distance from the pixel point to the center of the ellipse is not more than the length of the target longitudinal axis in advance, and the target processing rule is determined according to the first mapping relation;
the mapping determination module is further adapted to: and predetermining a second mapping relation between the translation distance information corresponding to the pixel point and the equivalent circumferential distance from the pixel point to the center of the ellipse when the equivalent circumferential distance from the pixel point to the center of the ellipse is greater than the length of the target longitudinal axis and less than the length of the ellipse longitudinal axis, and determining the environment processing rule according to the second mapping relation.
23. The apparatus of claim 22, wherein the translation module is further adapted to:
calculating a target coordinate value of the pixel point according to the translation distance information and the original coordinate value of the pixel point;
and determining the target position according to the target coordinate value, and translating the pixel point to the target position.
24. The apparatus of claim 23, wherein the translation module is further adapted to:
and creating a blank image corresponding to the image in advance, and determining the target position in the blank image according to the target coordinate value.
25. The apparatus of claim 24, wherein the correction factor further comprises: a lateral correction factor and a longitudinal correction factor.
26. An electronic device, comprising: the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction causes the processor to execute the operation corresponding to the image processing method according to any one of claims 1-13.
27. A computer storage medium having stored therein at least one executable instruction for causing a processor to perform operations corresponding to the image processing method of any one of claims 1-13.
CN201810229530.1A 2018-03-20 2018-03-20 Image processing method and device and electronic equipment Active CN108389155B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810229530.1A CN108389155B (en) 2018-03-20 2018-03-20 Image processing method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN108389155A CN108389155A (en) 2018-08-10
CN108389155B true CN108389155B (en) 2021-10-01

Family

ID=63067813

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810229530.1A Active CN108389155B (en) 2018-03-20 2018-03-20 Image processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN108389155B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109285126B (en) * 2018-08-17 2022-09-09 上海商汤智能科技有限公司 Image processing method and device, electronic equipment and storage medium
CN110910393B (en) * 2018-09-18 2023-03-24 北京市商汤科技开发有限公司 Data processing method and device, electronic equipment and storage medium
CN112396553A (en) * 2019-07-30 2021-02-23 北京嗨动视觉科技有限公司 Image moving method, device and system and computer readable medium
CN110611767B (en) * 2019-09-25 2021-08-10 北京迈格威科技有限公司 Image processing method and device and electronic equipment
CN111507896B (en) * 2020-04-27 2023-09-05 抖音视界有限公司 Image liquefaction processing method, device, equipment and storage medium
CN113596314B (en) * 2020-04-30 2022-11-11 北京达佳互联信息技术有限公司 Image processing method and device and electronic equipment
CN111861868B (en) * 2020-07-15 2023-10-27 广州光锥元信息科技有限公司 Image processing method and device for beautifying human images in video
CN113781295B (en) * 2021-09-14 2024-02-27 网易(杭州)网络有限公司 Image processing method, device, equipment and storage medium
CN114995738B (en) * 2022-05-31 2023-06-16 重庆长安汽车股份有限公司 Transformation method, transformation device, electronic equipment, storage medium and program product
CN116579934B (en) * 2023-04-06 2024-04-16 湖南师范大学 Embroidery plate making processing method and system based on edge detection

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105282513A (en) * 2015-10-23 2016-01-27 国网山西省电力公司大同供电公司 Device and method for detecting operation state of ultra-high-voltage transformer in transformer substation based on 3D infrared panoramic image
CN106846255A (en) * 2017-02-23 2017-06-13 北京普及芯科技有限公司 Image rotation implementation method and device
CN107395958A (en) * 2017-06-30 2017-11-24 北京金山安全软件有限公司 Image processing method and device, electronic equipment and storage medium
CN107578380A (en) * 2017-08-07 2018-01-12 北京金山安全软件有限公司 Image processing method and device, electronic equipment and storage medium
CN107730465A (en) * 2017-10-09 2018-02-23 武汉斗鱼网络科技有限公司 Face U.S. face method and device in a kind of image

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015186170A (en) * 2014-03-26 2015-10-22 ソニー株式会社 Image processing apparatus and image processing method
US9412176B2 (en) * 2014-05-06 2016-08-09 Nant Holdings Ip, Llc Image-based feature detection using edge vectors

Also Published As

Publication number Publication date
CN108389155A (en) 2018-08-10


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant