CN112750139A - Image processing method and device, computing equipment and storage medium - Google Patents


Info

Publication number
CN112750139A
Authority
CN
China
Prior art keywords: control points, initial control, fitting, control point, coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110061765.6A
Other languages
Chinese (zh)
Inventor
Ding Hongli (丁洪利)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202110061765.6A
Publication of CN112750139A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/20 Drawing from basic elements, e.g. lines or circles
    • G06T11/203 Drawing of straight lines or curves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/194 Segmentation; Edge detection involving foreground-background segmentation

Abstract

The application provides an image processing method comprising the following steps: acquiring a plurality of initial control points characterizing the original contour of a target image area, wherein each initial control point has coordinates; fitting the plurality of initial control points to obtain fitted coordinates of at least some of them, and performing a coordinate update on the plurality of initial control points by updating the coordinates of the corresponding initial control points based on the fitted coordinates; based on the coordinates of the initial control points after the coordinate update, removing from the plurality of initial control points those that make a repeated contribution to the formation of the smooth contour, to obtain the desired control points; and determining a smooth contour of the target image region based on the desired control points. The method may be implemented based on artificial intelligence. The target image area may be, for example, a water system area, and the generated smooth contour may be applied to a map. In this way, a smooth contour smoother than the original contour is generated while the number of control points required to determine it remains low.

Description

Image processing method and device, computing equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to an image processing method, an image processing apparatus, a computing device, and a computer-readable storage medium.
Background
Images typically comprise different regions characterizing different objects, and image processing often involves generating an image, or image data, that includes the outline of such an object region. For example, a map may include areas representing water systems, green spaces, roads, building groups, etc., whose shapes may be determined based on corresponding contours derived from, for example, satellite images. However, obtaining such contours by manual labeling takes substantial time and labor, while contours obtained by conventional contour extraction algorithms are not smooth enough and contain many control points, which occupy more storage space and transmission bandwidth. How to obtain a satisfactory contour of a region characterizing a particular object in an image is therefore an important research direction in the field of image processing.
Disclosure of Invention
In view of the above, the present application provides an image processing method, an image processing apparatus, a computing device and a computer readable storage medium, which may alleviate, reduce or even eliminate the above-mentioned problems.
According to an aspect of the present application, there is provided an image processing method including: acquiring a plurality of initial control points characterizing the original contour of a target image area, wherein each of the plurality of initial control points has coordinates; fitting the plurality of initial control points to obtain fitted coordinates of at least some of the plurality of initial control points, and performing a coordinate update on the plurality of initial control points by updating the coordinates of the respective initial control points based on the fitted coordinates; removing, based on the coordinates of the plurality of initial control points after the coordinate update, the initial control points that make a repeated contribution to the formation of the smooth contour from the plurality of initial control points, to obtain desired control points; and determining a smooth contour of the target image region based on the desired control points.
In some embodiments, said fitting the plurality of initial control points to obtain fitted coordinates of at least some of the plurality of initial control points, and performing the coordinate update on the plurality of initial control points by updating the coordinates of the respective initial control points based on the fitted coordinates, comprises: fitting the initial control points within a predetermined fitting window while sliding the predetermined fitting window by a predetermined step size, to obtain the fitted coordinates of at least one initial control point in the fitting window, and updating the coordinates of the corresponding initial control point among the at least one initial control point based on the fitted coordinates.
In some embodiments, the coordinates comprise a first coordinate value in a first dimension direction, and the fitting of initial control points within a predetermined fitting window while sliding the predetermined fitting window by a predetermined step size to obtain fitted coordinates of at least one initial control point in the fitting window, and updating the coordinates of the corresponding initial control point among the at least one initial control point based on the fitted coordinates, comprises: while sliding a first predetermined fitting window by a first predetermined step size, fitting based on the serial numbers of the initial control points and the first coordinate values of the initial control points within the first predetermined fitting window to obtain the fitted first coordinate value of at least one initial control point, and updating the first coordinate value of the corresponding initial control point based on the obtained fitted first coordinate value, so as to update the coordinates of the corresponding initial control point.
In some embodiments, the coordinates further comprise a second coordinate value in a second dimension direction, and the fitting of initial control points within the predetermined fitting window while sliding the predetermined fitting window by the predetermined step size to obtain fitted coordinates of at least one initial control point in the fitting window, and updating the coordinates of the corresponding initial control point among the at least one initial control point based on the fitted coordinates, further comprises: while sliding a second predetermined fitting window by a second predetermined step size, fitting based on the serial numbers of the initial control points and the second coordinate values of the initial control points within the second predetermined fitting window to obtain the fitted second coordinate value of at least one initial control point, and updating the second coordinate value of the corresponding initial control point based on the obtained fitted second coordinate value, so as to update the coordinates of the corresponding initial control point.
In some embodiments, said fitting initial control points within a predetermined fitting window while sliding the predetermined fitting window by a predetermined step size to obtain fitted coordinates of at least one initial control point in the fitting window comprises: fitting the initial control points within the predetermined fitting window while sliding the predetermined fitting window by the predetermined step size, to obtain the fitted coordinates of the central initial control point of the fitting window.
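As an illustrative sketch (not the patent's reference implementation), the per-dimension sliding-window fitting described above can be approximated in Python with NumPy: one coordinate sequence at a time (e.g. all first coordinate values, ordered by control-point serial number) is smoothed by a least-squares polynomial fit inside a sliding window, and only the central value of each window is replaced by its fitted value. The window size, step of 1, and polynomial degree are assumptions for illustration.

```python
import numpy as np

def sliding_window_fit(values, window=5, degree=2):
    """Smooth one 1-D coordinate sequence by sliding a fitting window over it
    and replacing each window's central value with the value predicted by a
    least-squares polynomial fit of the points inside the window.

    `values` holds one coordinate component (e.g. all first coordinate
    values) of the control points, indexed by control-point serial number."""
    values = np.asarray(values, dtype=float)
    half = window // 2
    smoothed = values.copy()
    for center in range(half, len(values) - half):
        idx = np.arange(center - half, center + half + 1)  # serial numbers in window
        coeffs = np.polyfit(idx, values[idx], degree)      # least-squares fit
        smoothed[center] = np.polyval(coeffs, center)      # fitted center coordinate
    return smoothed
```

The same routine would be run once per dimension (the "first" and "second" fitting windows above), with window sizes and step sizes chosen independently.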
In some embodiments, said fitting the plurality of initial control points to obtain the fitted coordinates of at least some of the plurality of initial control points comprises: fitting the plurality of initial control points by a least squares method to obtain a fitted curve, and determining the fitted coordinates of at least some of the plurality of initial control points based on the fitted curve.
In some embodiments, said obtaining a plurality of initial control points characterizing an original contour of the target image region comprises: acquiring a plurality of original control points representing original contours of a target image area, wherein each original control point has a coordinate; and interpolating the plurality of original control points based on the coordinates of the original control points to obtain the plurality of initial control points.
In some embodiments, the coordinates have at least two dimensions, and the interpolating the plurality of original control points based on the coordinates of the original control points to obtain the plurality of initial control points comprises: in the direction of a first of the two dimensions, when the distance between adjacent original control points is greater than a first threshold distance, inserting at least one first interpolation control point between the adjacent original control points at equal intervals; in the direction of a second of the two dimensions, when the distance between adjacent transition control points among a plurality of transition control points is greater than a second threshold distance, inserting at least one second interpolation control point between the adjacent transition control points at equal intervals, wherein the plurality of transition control points comprise the plurality of original control points and the at least one first interpolation control point; and determining the plurality of transition control points and the at least one second interpolation control point as the plurality of initial control points.
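A minimal sketch of this interpolation step, under simplifying assumptions: the points form an open 2-D polyline, and for brevity the two per-dimension passes described above are collapsed into a single pass driven by the larger per-axis coordinate gap. The threshold value and helper name are illustrative.

```python
def densify(points, max_gap):
    """Insert equally spaced interpolation control points between adjacent
    points whose coordinate gap exceeds max_gap.  Sketch only: the patent
    performs two separate passes, one per dimension, each with its own
    threshold distance."""
    result = [points[0]]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        gap = max(abs(x1 - x0), abs(y1 - y0))
        if gap > max_gap:
            k = int(gap // max_gap)  # number of points to insert at equal intervals
            for j in range(1, k + 1):
                t = j / (k + 1)
                result.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
        result.append((x1, y1))
    return result
```

Densifying first ensures the later sliding-window fit always has enough nearby points, even along long straight contour segments.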
In some embodiments, the obtaining of original control points characterizing the original contour of the target image area comprises: acquiring an original image comprising the target image area; performing semantic segmentation on the original image to generate a binarized mask map comprising the target image area; and extracting the plurality of original control points on the original contour of the target image area from the binarized mask map.
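The extraction step can be sketched as collecting the region pixels of the binarized mask that touch the background. This is a deliberate simplification: a real implementation would use an edge-recognition or border-following algorithm that returns boundary points in contour order, and the mask format (nested lists of 0/1) is an assumption.

```python
def boundary_pixels(mask):
    """Return (row, col) pixels of the binary target region that have at
    least one 4-neighbor in the background or outside the image -- a crude
    stand-in for the contour-extraction step applied to the mask map."""
    h, w = len(mask), len(mask[0])
    pts = []
    for r in range(h):
        for c in range(w):
            if not mask[r][c]:
                continue  # background pixel, skip
            neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(not (0 <= rr < h and 0 <= cc < w) or not mask[rr][cc]
                   for rr, cc in neighbors):
                pts.append((r, c))
    return pts
```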
In some embodiments, said removing of initial control points that make a repeated contribution to the formation of a smooth contour from said plurality of initial control points comprises: performing at least one round of a removal process on the plurality of initial control points, and in each round of the removal process, removing initial control points having a repeated contribution while sliding a predetermined removal window by a predetermined removal step, based on one of the following: when the angle formed by any three consecutive unremoved initial control points in the removal window is larger than a threshold angle, removing the middle one of the three initial control points as an initial control point having a repeated contribution; when the area of the triangle formed by any three consecutive unremoved initial control points in the removal window is smaller than a threshold area, removing the middle one of the three initial control points as an initial control point having a repeated contribution; and when the maximum point-to-line distance between each unremoved initial control point in the removal window and the line connecting the first and last initial control points of the removal window is smaller than a threshold point-to-line distance, removing the initial control point corresponding to that maximum point-to-line distance as an initial control point having a repeated contribution.
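The first criterion (the angle test) can be illustrated as follows: for three consecutive control points, an angle at the middle point close to 180 degrees means the three points are nearly collinear, so the middle point adds little shape information and may be removed. The helper functions and the 170-degree threshold below are illustrative assumptions, not values from the patent.

```python
import math

def angle_at(a, b, c):
    """Angle in degrees at vertex b of the path a -> b -> c
    (180 degrees means a, b and c are collinear)."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos_t = dot / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

def is_redundant(a, b, c, threshold_deg=170.0):
    """Middle point b has a repeated contribution when the angle at b
    exceeds the threshold angle."""
    return angle_at(a, b, c) > threshold_deg
```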
In some embodiments, said determining a smooth contour of said target image region based on said desired control points comprises: receiving a corrective action for the desired control point; correcting the desired control point based on the correcting operation; determining a smooth contour of the target image region based on the modified desired control points.
In some embodiments, the receiving a corrective action for the desired control point comprises: receiving at least one of the following corrective actions: adding at least one new desired control point, removing at least one desired control point, and adjusting coordinates of at least one desired control point.
According to another aspect of the present application, there is provided an image processing apparatus including: an acquisition module configured to acquire a plurality of initial control points characterizing an original contour of a target image region, each of the plurality of initial control points having coordinates; a fitting module configured to fit the plurality of initial control points to obtain fitted coordinates of at least some of the plurality of initial control points, and perform coordinate updating on the plurality of initial control points by updating coordinates of respective initial control points based on the fitted coordinates; a removal module configured to remove initial control points having repeated contribution to the formation of the smooth contour from the plurality of initial control points based on the coordinates of the plurality of initial control points after the coordinate update to obtain desired control points; a determination module configured to determine a smooth contour of the target image region based on the desired control point.
According to yet another aspect of the present application, there is provided a computing device comprising: a memory configured to store computer-executable instructions; a processor configured to perform the method recited in the above aspect when the computer-executable instructions are executed by the processor.
According to yet another aspect of the present application, there is provided a computer-readable storage medium storing computer-executable instructions that, when executed, perform the method recited in the above aspect.
The image processing method provided by the application can be used to determine the smooth contour of a target image area in an image. With this method, a fitting process can be performed on the obtained initial control points characterizing the original contour of the target image area, the coordinates of the initial control points are updated based on the fitting process, at least some of the initial control points that make a repeated contribution to the formation of the smooth contour are removed based on the updated coordinates to obtain the desired control points, and the smooth contour of the target image area can then be determined based on the desired control points. Performing the fitting process and the removal of repeatedly contributing control points in sequence helps make the finally determined smooth contour smoother than the original contour, while reducing the number of contour control points needed to determine such a smoother contour, thereby reducing the space or bandwidth needed to store or transmit it.
These and other aspects of the application will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
Drawings
Further details, features and advantages of the present application are disclosed in the following description of exemplary embodiments, which is to be read in connection with the accompanying drawings, in which:
fig. 1 schematically shows an example scenario in which the technical solution provided by the present application may be applied;
FIG. 2 schematically illustrates an example flow diagram of an image processing method according to some embodiments of the present application;
FIG. 3 schematically illustrates another example flow diagram of an image processing method according to some embodiments of the present application;
FIG. 4 schematically illustrates an example original image, according to some embodiments of the present application;
FIG. 5 schematically illustrates an architecture diagram of an example semantic segmentation model according to some embodiments of the present application;
FIG. 6 schematically illustrates an example binarized mask map according to some embodiments of the present application;
FIG. 7 schematically illustrates an example interpolation process according to some embodiments of the present application;
FIG. 8 schematically illustrates an example fitting process according to some embodiments of the present application;
FIGS. 9A-9C schematically illustrate an example removal process according to some embodiments of the present application;
FIG. 10 schematically illustrates an example block diagram of an image processing apparatus according to some embodiments of this application;
FIG. 11 schematically illustrates an example block diagram of a computing device in accordance with some embodiments of the present application.
Detailed Description
Before describing embodiments of the present application in detail, some relevant concepts are explained first:
1. Control points: control points can be understood as points that control the shape of a contour; they may comprise all points on the contour, or only some key points, e.g., points where the contour changes noticeably, such as turning positions or positions of larger curvature. Control points may be stored or transmitted in the form of coordinate points, such as two-dimensional coordinates, three-dimensional coordinates, and the like. For example, in a map, control points may be stored or transmitted as two-dimensional coordinates based on longitude and latitude, or as three-dimensional coordinates based on longitude, latitude and altitude.
2. Original contour/smooth contour: in general, a contour may refer to the edge of a region or its boundary with other regions. In this application, a contour may be understood as a contour image or as contour data, i.e., an image including the contour of a certain area (in any suitable format) or data characterizing the contour of a certain area (e.g., a collection of data characterizing the points or lines of the contour). Taking the smooth contour as an example, in the present application a smooth contour may encompass both smooth contour data and a smooth contour image: the smooth contour data may include data related to the desired control points, such as their coordinate data, while the smooth contour image may be generated by connecting the desired control points using straight lines or curves, or by processing the desired control points according to another predetermined mechanism to obtain the contour they characterize, and optionally further rendering the resulting contour.
3. Artificial Intelligence (AI): a theory, method, technique and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain optimal results. In other words, artificial intelligence is a comprehensive branch of computer science that attempts to understand the essence of intelligence and to produce a new kind of intelligent machine that can react in a manner similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines, so that machines acquire the capabilities of perception, reasoning and decision making. It is a comprehensive discipline covering a wide range of fields, including both hardware-level and software-level technologies. Basic artificial intelligence technologies generally include sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing, operation/interaction systems, mechatronics, and the like. Artificial intelligence software technologies mainly include computer vision, speech processing, natural language processing, and machine learning/deep learning. Some steps of the image processing method provided by the present application may be implemented based on artificial intelligence, for example, processing an original image through a semantic segmentation model to obtain a corresponding binarized image, and extracting the original contour from the binarized image through an edge recognition algorithm.
Embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 schematically shows an example scenario 100 where the technical solution provided by the present application may be applied. As shown in fig. 1, the technical solution provided by the present application may be implemented by means of a computing device 110. Alternatively, the computing device 110 may be any device with computing capabilities, such as a server, desktop computer, laptop computer, tablet computer, smart phone, etc., as well as any component of a device with computing capabilities, such as a processor, controller, programmable logic device, etc. Illustratively, the computing device 110 may obtain a plurality of initial control points, as indicated by reference numeral 120, characterizing the original contour of the target image area, which are processed according to the solution provided herein (to be described in detail below) to obtain a plurality of desired control points, as indicated by reference numeral 130. As shown, the number of desired control points obtained via the solution provided by the present application may be significantly lower than the number of initial control points, while the smooth contour based on the desired control points may be smoother in shape than the original contour based on the initial control points.
Illustratively, the computing device 110 may obtain the plurality of initial control points by receiving from an external device, receiving via user input, reading from local storage or removable external storage, or generating based on raw images, etc. The computing device 110 may store or transmit the resulting desired control point as smooth contour data or a portion of smooth contour data corresponding to the target image region. Alternatively, the computing device 110 may generate a corresponding smooth contour image in a particular format based on the resulting desired control points, and similarly, such smooth contour images may also be stored or transmitted.
The technical solutions provided in the present application may be implemented in the computing device 110 as a computer program or a computer program product, for example, as a program segment comprising computer instructions, an application program, a part of an application program, etc.
Optionally, the computing device 110 may provide a communication interface to receive data from and/or transmit data to an external device (e.g., a smooth profile determined based on a plurality of desired control points, indicated by reference numeral 130, etc.). For example, the computing device 110 may communicate with external devices via various wired or wireless communication means, such as via cable, fiber optics, WIFI, bluetooth, cellular networks, near field communications, and so forth.
Optionally, the computing device 110 may provide input and/or output interfaces for the user 140 to input raw images to be processed or data including initial control points, view and/or modify the generated smooth contours, and the like. It should be understood that references herein to a user (e.g., user 140) may encompass a variety of classes of users, such as end users, operation and maintenance personnel, test personnel, and any class of users of other devices, equipment, applications, program instructions, and the like.
Fig. 2 schematically illustrates an example flow diagram of an image processing method 200 according to some embodiments of the present application. Illustratively, the image processing method 200 may be applied to the scene 100 shown in fig. 1, for example, as performed by computer instructions comprised by a computer program or computer program product deployed in the computing device 110.
At step 210, a plurality of initial control points characterizing the original contour of the target image region may be obtained, wherein each of the plurality of initial control points may have coordinates. Illustratively, as previously described, the plurality of initial control points may be received externally, read from a storage device, or generated locally. The coordinates of the initial control points may be two-dimensional coordinates, three-dimensional coordinates, or coordinates of higher dimensions. For example, the coordinates of an initial control point may be coordinates in a Cartesian coordinate system, such as one established with a predefined reference point and a predefined reference direction in the target image area, one established with a predefined reference point and a predefined reference direction in the original image including the target image area, one established with longitude and latitude (and optionally altitude), and so on. In particular, the coordinates of the initial control points may be coordinates in a two-dimensional rectangular coordinate system.
In step 220, a plurality of initial control points may be fitted to obtain fitted coordinates of at least some of the plurality of initial control points, and a coordinate update may be performed on the plurality of initial control points by updating the coordinates of the respective initial control points based on the fitted coordinates. Alternatively, the fitting in this step may be performed in a manner selected according to actual needs, such as linear fitting, polynomial fitting, exponential fitting, Gaussian fitting, and the like. According to the selected fitting manner, the fitting parameters in the corresponding fitted curve equation may be determined based on some or all of the plurality of initial control points. For example, for a polynomial fit with k fitting parameters, the fitted curve equation can be expressed as

y(x) = a_0 + a_1 x + a_2 x^2 + ... + a_{k-1} x^{k-1}

where the polynomial coefficients a_0, a_1, ..., a_{k-1} are the fitting parameters, which may be determined based on some or all of the plurality of initial control points. According to the principle of the least squares method, when the number of given initial control points is larger than the number k of fitting parameters, the corresponding fitting parameters can be solved. Once the fitting parameters are determined, the fitted curve equation is determined, so that the fitted coordinates of at least some of the plurality of initial control points can be obtained from it: the fitted coordinates are coordinates that satisfy the determined fitted curve equation. Subsequently, the coordinates of the corresponding initial control points may be updated based on the resulting fitted coordinates.
In some embodiments, the plurality of initial control points may be fitted by a least squares method to obtain a fitted curve, and fitted coordinates of at least some of the plurality of initial control points are determined based on the fitted curve. In other words, the fitting parameters in the fitted curve equation described above may be determined by the least squares method. The least squares method may find the best fit curve for the data points to be fitted by minimizing the sum of the squares of the errors, where the error refers to the distance of the data point being fitted to the fit curve. With the least squares method, the fitted curve can be solved easily and the error between the fitted curve and the fitted coordinates can be minimized, allowing a better fitting effect at a faster fitting speed. Further, the fitting process may be implemented by, for example, a cubic spline method, an RBF (Radial Basis Function) neural network, or the like.
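The least-squares determination of the fitting parameters can be sketched by solving the overdetermined Vandermonde system directly. Here k is the number of polynomial coefficients, and the sample data in the usage note are purely illustrative.

```python
import numpy as np

def least_squares_poly(xs, ys, k):
    """Find the k polynomial coefficients a_0 .. a_{k-1} that minimize the
    sum of squared errors sum_i (y_i - p(x_i))^2, where
    p(x) = a_0 + a_1*x + ... + a_{k-1}*x^{k-1}."""
    # Vandermonde matrix with columns [1, x, x^2, ..., x^{k-1}].
    V = np.vander(np.asarray(xs, dtype=float), k, increasing=True)
    coeffs, *_ = np.linalg.lstsq(V, np.asarray(ys, dtype=float), rcond=None)
    return coeffs
```

For example, `least_squares_poly([0, 1, 2, 3], [1, 3, 5, 7], 2)` recovers the line y = 1 + 2x. As stated above, the number of points must exceed the number k of fitting parameters for the system to be well determined.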
In step 230, the initial control points that make a repeated contribution to the formation of the smooth contour may be removed from the plurality of initial control points, based on the coordinates of the plurality of initial control points after the coordinate update, to obtain the desired control points. Alternatively, such initial control points may be removed based on various predetermined rules. In the present application, the contribution of an initial control point to the formation of the smooth contour may be understood as the influence of that initial control point on the smooth contour. Initial control points that make a repeated contribution to the formation of the smooth contour may be understood as initial control points that have the same or a similar effect on the smooth contour. For example, if the degree of difference between the smooth contour obtained in the presence of an initial control point and the smooth contour obtained in its absence is below a threshold degree of difference, that initial control point may be considered to make a contribution that repeats those of several initial control points within its neighborhood. The degree of difference can be measured by the size of the angle formed by the initial control point and its two adjacent initial control points, by the area of the triangle formed by the initial control point and its two adjacent initial control points, or by the distance from the initial control point to the line connecting two points in its neighborhood.
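The triangle-area measure of the degree of difference can be sketched as a greedy pass: the middle one of three consecutive surviving points is dropped when the triangle they form has an area below the threshold. The shoelace helper and the single-pass structure are illustrative; as described above, the patent additionally slides a removal window and may run several rounds.

```python
def remove_redundant(points, min_area):
    """Single removal pass: drop the middle of any three consecutive
    surviving points whose triangle area is below min_area, i.e. points
    whose removal barely changes the contour."""
    def tri_area(a, b, c):
        # Shoelace formula for the area of triangle a-b-c.
        return abs((b[0] - a[0]) * (c[1] - a[1])
                   - (c[0] - a[0]) * (b[1] - a[1])) / 2.0
    kept = [points[0]]
    for i in range(1, len(points) - 1):
        if tri_area(kept[-1], points[i], points[i + 1]) >= min_area:
            kept.append(points[i])
    kept.append(points[-1])
    return kept
```

On a nearly straight run of points, only the endpoints survive; points at genuine corners are kept because the triangle around them is large.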
At step 240, a smooth contour of the target image region may be determined based on the desired control points. As mentioned above, smooth contour data comprising the desired control points (or further-processed desired control points) may be determined as the smooth contour of the target image area. Alternatively, a smooth contour image may be generated as the smooth contour of the target image region, based on the obtained desired control points or on further-processed desired control points, for example by connecting the desired control points using straight lines or curves, or according to another predetermined mechanism.
The image processing method 200 shown in fig. 2 helps to make the finally generated smooth contour smoother than the original contour, while reducing the number of contour control points needed to determine such a smoother contour, thereby reducing the space or bandwidth needed to store or transmit it.
FIG. 3 is a flow diagram schematically illustrating an example implementation, according to some embodiments, of the image processing method 200 shown in FIG. 2.
In some embodiments, as described with respect to fig. 2, the plurality of initial control points may be generated locally in step 210, which may be implemented, for example, by steps 211 and 212 shown in fig. 3.
Specifically, in step 211, a plurality of original control points characterizing an original contour of the target image area are obtained, each original control point having coordinates. The plurality of primitive control points may be received externally, read from a storage device, or generated locally. Similar to the description above regarding the coordinates of the initial control points, the coordinates of the original control points may be any suitable form of coordinates.
Illustratively, the plurality of raw control points may be automatically generated locally by an artificial intelligence based method. For example, a plurality of original control points may be automatically generated locally by the following process.
First, an original image including the target image area may be acquired. For example, the original image may be received from an external device, received via user input, read from local storage or removable external storage, and so forth. The original image may be any suitable form of image acquired in any manner. Illustratively, the original image may be a satellite image, and fig. 4 schematically shows an example satellite image 400 that may serve as the original image. As shown in fig. 4, the satellite image 400 contains a plurality of different types of target areas, such as a water system area 410, a road area 420, a building area 430, a green space area 440, and the like. Exemplarily, hereinafter, the process of acquiring a plurality of original control points according to some embodiments of the present application is described by taking the water system area 410 included in the satellite image 400 as an example.
The original image may then be semantically segmented to generate a binarized mask map comprising the target image area. Semantic segmentation converts the original image into a mask in which the regions of interest are highlighted, and facilitates dividing different types of regions of interest into different mask regions. Illustratively, semantic segmentation may be implemented by models such as FCN (fully convolutional network), UNet, SegNet, PSPNet, Deeplab, or ICNet. Further, illustratively, the binarized mask map including the target image area may also be generated by performing instance segmentation, panoptic segmentation, or the like on the original image.
Exemplarily, fig. 5 schematically shows an architecture diagram of the UNet model 500. The input image of the model 500 may be an original image or a pre-segmented portion of an original image, such as the satellite image 400 shown in fig. 4. The output segmentation map of the model 500 may be the resulting binarized mask map, where, for example, the portion of the target image area may be assigned a value of 1 and the remaining portion a value of 0. As shown in fig. 5, the overall architecture of the UNet model 500 used according to some embodiments of the present application is "U"-shaped, with a feature extraction path 510 on one side and an upsampling path 520 on the other. Rectangles, similar to the one indicated by reference numeral 530, represent feature maps; each feature map is labeled with a channel number 531 and a size 532. In the feature extraction path 510, a rightward arrow represents a 3×3 convolutional layer and an activation layer (e.g., ReLU) for feature extraction, and a downward arrow represents a pooling layer for reducing dimensionality. In the upsampling path 520, a rightward arrow (other than the rightward arrow associated with the output layer 540) likewise represents a 3×3 convolutional layer and an activation layer (e.g., ReLU) for feature extraction, and an upward arrow represents upsampling, e.g., a 2×2 upsampling convolutional layer such as a transposed convolutional layer, for recovering dimensionality. The rightward arrow associated with the output layer represents a 1×1 convolutional layer that performs classification, and the binarized mask map is finally generated based on this classification.
The rightward arrow between the feature extraction path 510 and the upsampling path 520 represents a copy-and-crop operation: the part of a feature map in the feature extraction path 510 with the same number of channels is concatenated to the corresponding feature map in the upsampling path 520 for feature fusion, so that more accurate context information can be obtained and a better segmentation effect achieved.
Fig. 6 schematically shows an exemplary binarized mask map 600 obtained according to the above-described process. As shown in fig. 6, the binarized mask map 600 includes two regions 610 and 620. Corresponding to the satellite image 400 shown in fig. 4, the region 610 in fig. 6 corresponds to the water system area 410 in fig. 4, and the region 620 in fig. 6 corresponds to the area other than the water system area 410 in fig. 4. Illustratively, in the binarized mask map 600, the region 610 may be assigned 1 and the region 620 assigned 0. Generating the binarized mask map facilitates the extraction of the original control points on the original contour.
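The final assignment step just described (1 inside the target region, 0 elsewhere) can be sketched in a few lines. This is a minimal Python illustration assuming the segmentation model has already produced a per-pixel class-label map; the function name and interface are illustrative, not part of the application:

```python
import numpy as np

def binarize_mask(class_map, target_class):
    """Convert a per-pixel class-label map (the output of semantic
    segmentation) into the binarized mask of one target class:
    1 inside the target image area, 0 everywhere else."""
    return (class_map == target_class).astype(np.uint8)
```

For the water system example, `target_class` would be whatever label the model assigns to water pixels.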
Finally, a plurality of original control points on the original contour of the target image area may be extracted from the binarized mask map. Optionally, the extracted original control points may include all points on the original contour of the target image region, or only some of them, for example points at turning positions or positions of large curvature. Optionally, any suitable method may be used to extract the original control points on the original contour of the target image region, such as, but not limited to, an edge-tracking algorithm, an interior-point deletion algorithm, or the use of various edge detection operators (such as the Roberts, Sobel, Prewitt, Kirsch, Robinson, Laplacian, Canny, or LOG operators).
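As one hedged illustration of extracting candidate contour points from a binarized mask, the sketch below marks foreground pixels that have at least one background 4-neighbor. It yields an unordered set of boundary pixels; an edge-tracking pass, as mentioned above, would additionally order them along the contour. The function name and the specific boundary criterion are illustrative assumptions, not the application's prescribed method:

```python
import numpy as np

def boundary_points(mask):
    """Return the foreground pixels of a 0/1 mask that lie on the region
    contour, i.e. foreground pixels with at least one background 4-neighbor.
    Result: unordered boundary pixels as (x, y) tuples."""
    m = np.pad(mask, 1)                      # zero border simplifies shifts
    fg = m[1:-1, 1:-1].astype(bool)          # the original mask
    interior = (fg
                & m[:-2, 1:-1].astype(bool)  # up neighbor
                & m[2:, 1:-1].astype(bool)   # down neighbor
                & m[1:-1, :-2].astype(bool)  # left neighbor
                & m[1:-1, 2:].astype(bool))  # right neighbor
    ys, xs = np.nonzero(fg & ~interior)
    return list(zip(xs.tolist(), ys.tolist()))
```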
At step 212, the plurality of original control points are interpolated based on their coordinates to obtain the plurality of initial control points. For example, the distance between each pair of adjacent original control points may be calculated in turn based on the coordinates of the original control points, and when the distance is greater than a preset threshold, at least one interpolation control point may be inserted between the adjacent original control points, for example so that the distance between adjacent control points (including the original control points and the interpolation control points) is not greater than the preset threshold. The distance may be predefined according to actual needs, for example a Euclidean distance between adjacent original control points, or a distance in the direction of a certain dimension. Optionally, the at least one interpolation control point may be inserted at equal intervals between the adjacent original control points, or may be inserted according to another predetermined mechanism. The preset distance threshold may be set in advance according to actual requirements, or may be determined adaptively. In addition, it should be understood that, in the description of the present application, when two objects are described as adjacent, "adjacent" means that there is no other object of the same type between them; for example, adjacent original control points means that there is no other original control point between the two original control points, and adjacent control points means that there is no other control point between the two control points (here, a control point includes any control point, such as an original control point or an inserted control point).
Interpolating the original control points helps the subsequent fitting step to obtain a smoother fitted curve that better reflects the trend of the contour, so that the resulting smooth contour is smoother than the original contour while maintaining high accuracy.
In some embodiments, as described above, the coordinates of the original control point may have at least two dimensions, i.e. at least comprise coordinate values in two dimensions. In this case, the interpolation operation can be done by sequentially interpolating in two dimensional directions.
Specifically, in the direction of a first dimension of the two dimensions, when the distance between adjacent original control points is greater than a first threshold distance, at least one first interpolation control point is inserted between the adjacent original control points at equal intervals, so that the intervals between the adjacent original control points and any two adjacent control points among the first interpolation control points inserted therebetween are all equal. The plurality of original control points and the interpolated first interpolated control point may be considered as a plurality of transitional control points. Then, when a distance between adjacent transition control points of the plurality of transition control points is greater than a second threshold distance in a direction of a second dimension of the two dimensions, at least one second interpolation control point is inserted between the adjacent transition control points at an equal interval such that intervals between the adjacent transition control points and any two adjacent control points of the second interpolation control points inserted therebetween are all equal. Finally, the plurality of transition control points and the interpolated second interpolated control point may be determined as a plurality of initial control points. The first dimension and the second dimension may be two dimensions of two-dimensional coordinates, or two dimensions of three-dimensional coordinates or more. For example, in an application involving a map, the first dimension and the second dimension may be dimensions defined by or set in relation to latitude and longitude. For example, the first threshold distance and the second threshold distance may be set to be the same or different according to actual needs.
Fig. 7 schematically illustrates an example interpolation process in one dimension according to some embodiments of the present application. As shown in fig. 7, assume that point 710 and point 720 are adjacent original control points, each having two-dimensional coordinates comprising a coordinate value in the x-dimension and a coordinate value in the y-dimension; for example, the coordinates of point 710 are (x1, y1) and the coordinates of point 720 are (x2, y2). Illustratively, in the direction of the x-dimension, the distance between point 710 and point 720 may be calculated. As shown, point 710 and point 720 are separated by Δx in the direction of the x-dimension, e.g., the difference between x1 and x2. It may be determined whether the distance Δx is greater than a preset first threshold distance. If so, at least one first interpolation control point may be inserted at equal intervals between the two points, for example so that the distance between adjacent control points (including the original control points and the first interpolation control points) in the direction of the x-dimension is not greater than the first threshold distance; if not, no first interpolation control point is inserted between the two points. For example, if Δx is 8 and the first threshold distance is 2, then 3 first interpolation control points 730 may be inserted at equal intervals between points 710 and 720, with a spacing of 2 in the direction of the x-dimension. For example, a straight line L may be fitted from the coordinates of point 710 and point 720; the x-dimension coordinate value of the inserted first interpolation control point closest to point 710 in fig. 7 may be x1+2, and this value may be substituted into the equation of the straight line L to obtain the corresponding y-dimension coordinate value; the combination of the two coordinate values gives the coordinates of that first interpolation control point. By analogy, the coordinates of all the first interpolation control points can be obtained, completing the interpolation in the direction of the x-dimension. The interpolated first interpolation control points, together with the original control points, constitute the transition control points. Subsequently, in the direction of the y-dimension, the above interpolation process can be performed similarly on the transition control points to complete the interpolation.
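The x-dimension pass just described (Δx = 8, threshold 2, three points inserted on the fitted line L) can be sketched as follows. This is a hedged Python illustration of step 212, with linear interpolation standing in for substituting into the equation of line L; the function name is illustrative:

```python
import math

def interpolate_x(points, threshold):
    """x-dimension interpolation pass of step 212: between each pair of
    adjacent control points whose x-distance exceeds `threshold`, insert
    equally spaced points lying on the straight line through the pair, so
    that no adjacent pair is farther apart than `threshold` in x."""
    out = []
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        out.append((x1, y1))
        segs = math.ceil(abs(x2 - x1) / threshold)  # resulting sub-segments
        for k in range(1, segs):                    # segs - 1 inserted points
            t = k / segs
            out.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    out.append(points[-1])
    return out
```

With the figure's numbers, two points 8 apart in x and a threshold of 2 yield 4 sub-segments and 3 inserted points, matching the example above. A symmetric pass over the y-dimension would then complete the interpolation.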
In some embodiments, as described with respect to fig. 2, in step 220 the plurality of initial control points may be fitted in various ways. Illustratively, this process may be implemented by step 221 shown in fig. 3.
In step 221, while sliding a predetermined fitting window by a predetermined step size, the initial control points within the fitting window may be fitted to obtain fitted coordinates of at least one initial control point in the window, and the coordinates of the corresponding initial control point(s) may be updated based on the fitted coordinates. The size of the fitting window and the predetermined step size may be preset according to actual needs, or may be determined automatically and/or adaptively. Optionally, the step size may be determined in association with the size of the fitting window to achieve a better overall fitting effect over all the initial control points. For example, the initial control points within the fitting window may be fitted to obtain the fitted coordinates of the central initial control point in the window, in which case the fitting window may optionally be sized to cover an odd number of initial control points so that a central initial control point exists. Alternatively or additionally, the fitted coordinates of other initial control points in the window may be obtained. Optionally, which initial control point or points in the fitting window to obtain fitted coordinates for may also be determined in association with the step size and the window size, for a better overall fitting effect. Compared with directly fitting all the initial control points at once, the sliding-window fitting process described in step 221 reduces the amount and complexity of computation and improves fitting efficiency while maintaining high fitting accuracy.
In some embodiments, as described above, the coordinates of the initial control points may include a first coordinate value in the first dimension direction. In this case, step 221 may include: while sliding a first predetermined fitting window by a first predetermined step size, fitting based on the serial numbers of the initial control points within the first predetermined fitting window and their first coordinate values to obtain a fitted first coordinate value of at least one initial control point, and updating the first coordinate value of the corresponding initial control point based on the obtained fitted first coordinate value, thereby updating the coordinates of the corresponding initial control point.
Exemplarily, fig. 8 schematically illustrates the above fitting process for the first dimension direction, where the first dimension direction is assumed to be the x-dimension, the fitting window in the figure represents the first predetermined fitting window, and the window has a size of 5 (i.e., covers 5 initial control points). As shown in fig. 8, the fitting window may be slid by a predetermined step size, which may be 1, 2, 3 or more; this is not particularly limited in the present application. Each time the fitting window slides by the predetermined step size, the initial control points in the window may be fitted and the coordinates of some of them updated. Specifically, within the fitting window shown in fig. 8, fitting may be performed based on the serial numbers of the initial control points and their x-dimension coordinate values to obtain the fitted x-dimension coordinate value of at least one initial control point (e.g., the initial control point at the center of the window). As described with respect to fig. 2, the initial control points in the fitting window may be polynomially fitted by the least squares method, or other fitting manners may be adopted. After the fitted curve equation is obtained, the serial number of the at least one initial control point within the fitting window may be substituted into it to obtain the fitted x-dimension coordinate value of that initial control point, which may be the central initial control point within the window, or may alternatively or additionally include other initial control points.
After the fitted x-dimension coordinate value of at least one initial control point in the fitting window is obtained, it may replace the current x-dimension coordinate value of the corresponding initial control point, thereby updating the coordinates of that initial control point. The fitting window may then continue to slide, and the initial control points within it continue to be fitted according to the above process, until the fitting window reaches the last initial control point.
In some embodiments, as described above, the coordinates of the initial control points may further include a second coordinate value in the second dimension direction. In this case, step 221 may further include: while sliding a second predetermined fitting window by a second predetermined step size, fitting based on the serial numbers of the initial control points within the second predetermined fitting window and their second coordinate values to obtain a fitted second coordinate value of at least one initial control point, and updating the second coordinate value of the corresponding initial control point based on the obtained fitted second coordinate value, thereby updating the coordinates of the corresponding initial control point.
For example, the fitting process for the second dimension direction may be performed similarly to the process shown in fig. 8, and is not described herein again. Optionally, the second predetermined step size may be the same as or different from the first predetermined step size, and the size of the second predetermined fitting window may be the same as or different from the size of the first predetermined fitting window, according to actual needs.
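A minimal sketch of the per-dimension sliding-window fitting of step 221, under stated simplifying assumptions: the window size is odd, the fit is a least-squares polynomial over (serial number, coordinate value) pairs, only the central point's coordinate is replaced, and the fit reads the original rather than already-updated values (one possible choice among those the application leaves open). Names are illustrative:

```python
import numpy as np

def smooth_coords(values, window=5, degree=2):
    """Slide a window of odd size over a 1-D coordinate sequence `values`
    (e.g. all x-dimension coordinate values, indexed by control-point
    serial number), fit a least-squares polynomial of `degree` to the
    (index, value) pairs inside the window, and replace the centre value
    with the fitted one."""
    half = window // 2
    out = list(values)
    for c in range(half, len(values) - half):
        idx = np.arange(c - half, c + half + 1)
        coeffs = np.polyfit(idx, [values[i] for i in idx], degree)
        out[c] = float(np.polyval(coeffs, c))  # fitted centre coordinate
    return out
```

The same routine would be run once per dimension (x, then y), possibly with different window sizes and step sizes as the text allows.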
In some embodiments, as described with respect to fig. 2, in step 230, an initial control point of the plurality of initial control points that repeatedly contributes to the formation of the smooth contour may be removed based on various predetermined rules, and the process may be implemented, for example, by step 231 shown in fig. 3.
At step 231, at least one round of removal processing is performed on the plurality of initial control points; in each round, while sliding a predetermined removal window by a predetermined removal step size, the initial control points with repeated contributions are removed according to a predetermined rule. Illustratively, the predetermined rule may be one of an angle-based rule, an area-based rule, and a point-line-distance-based rule. The sizes of the removal step and the removal window may be set according to actual needs, which is not limited in the present application. Optionally, the removal step size may be set in association with the size of the removal window.
Specifically, the angle-based rule may include: when the angle formed by any three consecutive unremoved initial control points in the removal window is greater than a threshold angle, removing the middle one of the three as an initial control point with a repeated contribution. Illustratively, fig. 9A schematically illustrates a process 900A of removal according to the angle-based rule. As shown in fig. 9A, assume that the size of the removal window is 5, i.e., it covers 5 initial control points, which may be consecutive initial control points that have not undergone any removal processing, or consecutive initial control points that remain after one or more rounds of removal processing. The removal window may be slid along the initial control points by the predetermined removal step size. According to actual needs, the removal step size may be 1, 2, 3 or more, and the threshold angle may be preset according to actual needs. For the initial control points A, B, C, D, E in the current removal window, the three angles they form, namely angle ABC, angle BCD, and angle CDE, can be calculated. If an angle is greater than the threshold angle, the middle one of the three initial control points corresponding to that angle is removed. Illustratively, if angle ABC is greater than the threshold angle, initial control point B is removed; further, if angle CDE is also greater than the threshold angle, initial control point D is also removed; while angle BCD, not being greater than the threshold angle, leaves initial control point C retained. After the removal operation for the initial control points in the current window is completed, the removal window may continue to slide.
Assuming that the removal step size is 1 and the initial control points B and D in the current removal window have been removed, the removal window may slide by 1 initial control point, that is, from starting at initial control point A to starting at initial control point C.
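The angle-based rule can be sketched as a single pass over consecutive triples of surviving points. This simplification drops the explicit size-5 removal window of the figure but applies the same test: the middle point of a nearly straight triple (angle above the threshold) contributes little and is removed. Function names are illustrative:

```python
import math

def angle(a, b, c):
    """Angle at vertex b, in degrees, formed by points a-b-c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / n))))

def remove_by_angle(points, threshold_deg):
    """One round of the angle rule: drop the middle point of any three
    consecutive surviving points whose angle exceeds `threshold_deg`."""
    kept = [points[0]]
    i = 1
    while i < len(points) - 1:
        if angle(kept[-1], points[i], points[i + 1]) > threshold_deg:
            i += 1                      # drop points[i], keep scanning
        else:
            kept.append(points[i])
            i += 1
    kept.append(points[-1])
    return kept
```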
The area-based rule may include: when the area of the triangle formed by any three consecutive unremoved initial control points in the removal window is smaller than a threshold area, removing the middle one of the three as an initial control point with a repeated contribution. As shown in fig. 9B, it is again assumed that the size of the removal window is 5, i.e., it covers 5 initial control points, which may be consecutive initial control points that have not undergone any removal processing, or consecutive initial control points that remain after one or more rounds of removal processing. The removal window may be slid along the initial control points by the predetermined removal step size. According to actual needs, the removal step size may be 1, 2, 3 or more, and the threshold area may be preset according to actual needs. For the initial control points A, B, C, D, E in the current removal window, the areas of the three triangles they form, namely triangle ABC, triangle BCD, and triangle CDE, can be calculated. If the area of a triangle is smaller than the threshold area, the middle one of the three initial control points corresponding to that triangle is removed. Illustratively, if the area of triangle ABC is smaller than the threshold area, initial control point B is removed; further, if the area of triangle CDE is also smaller than the threshold area, initial control point D is also removed; while triangle BCD, whose area is not smaller than the threshold area, leaves initial control point C retained. After the removal operation for the initial control points in the current window is completed, the removal window may continue to slide. Assuming that the removal step size is 1 and the initial control points B and D in the current removal window have been removed, the removal window may slide by 1 initial control point, that is, from starting at initial control point A to starting at initial control point C.
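The area-based rule can likewise be sketched as a single pass over consecutive triples of surviving points, with the triangle area computed via the cross product; this is a hedged simplification of the windowed procedure above, and the names are illustrative:

```python
def tri_area(a, b, c):
    """Area of triangle abc via the 2-D cross product."""
    return abs((b[0] - a[0]) * (c[1] - a[1])
               - (b[1] - a[1]) * (c[0] - a[0])) / 2.0

def remove_by_area(points, threshold_area):
    """One round of the area rule: drop the middle point of any three
    consecutive surviving points whose triangle area falls below
    `threshold_area` (a small triangle means a nearly redundant point)."""
    kept = [points[0]]
    i = 1
    while i < len(points) - 1:
        if tri_area(kept[-1], points[i], points[i + 1]) < threshold_area:
            i += 1                      # drop points[i]
        else:
            kept.append(points[i])
            i += 1
    kept.append(points[-1])
    return kept
```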
The point-line-distance-based rule may include: when the maximum of the point-line distances from each unremoved initial control point in the removal window to the line connecting the first and last initial control points of the window is smaller than a threshold point-line distance, removing the initial control point attaining that maximum as an initial control point with a repeated contribution. As shown in fig. 9C, it is again assumed that the size of the removal window is 5, i.e., it covers 5 initial control points, which may be consecutive initial control points that have not undergone any removal processing, or consecutive initial control points that remain after one or more rounds of removal processing. The removal window may be slid along the initial control points by the predetermined removal step size. According to actual needs, the removal step size may be 1, 2, 3 or more, and the threshold point-line distance may be preset according to actual needs. For the initial control points A, B, C, D, E in the current removal window, the point-line distances from points B, C, D to the straight line determined by points A and E may be calculated. If the maximum point-line distance is smaller than the threshold point-line distance, the initial control point attaining that maximum is removed and the other two initial control points are retained, so that the number of control points is reduced without excessively losing the shape characteristics of the contour. Illustratively, as shown in fig. 9C, the point-line distance from B to line AE is the maximum point-line distance in the current removal window; if it is smaller than the threshold point-line distance, initial control point B is removed and initial control points C and D are retained; if it is not smaller than the threshold point-line distance, initial control points B, C, D are all retained.
After the removal operation for the initial control points in the current window is completed, the removal window may continue to slide. Assuming that the removal step size is 1 and the initial control point B in the current removal window has been removed, the removal window may slide by 1 initial control point, that is, from starting at initial control point A to starting at initial control point C.
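For one position of the removal window, the point-line-distance rule can be sketched as follows: the chord joins the window's first and last points, and the interior point attaining the maximum distance is removed only when that maximum is below the threshold. This is a hedged illustration of a single window application, not the application's exact implementation; names are illustrative:

```python
import math

def point_line_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    num = abs((b[0] - a[0]) * (a[1] - p[1])
              - (a[0] - p[0]) * (b[1] - a[1]))
    return num / math.hypot(b[0] - a[0], b[1] - a[1])

def remove_by_dist(window, threshold):
    """Apply the point-line rule inside one removal window: if the largest
    distance from the interior points to the chord joining the window's
    first and last points is below `threshold`, drop the point attaining
    that maximum and keep the rest."""
    a, b = window[0], window[-1]
    interior = window[1:-1]
    dists = [point_line_dist(p, a, b) for p in interior]
    worst = max(range(len(dists)), key=dists.__getitem__)
    if dists[worst] < threshold:
        interior = interior[:worst] + interior[worst + 1:]
    return [a] + interior + [b]
```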
Through at least one round of the removal processing in step 231, at least a portion of the initial control points that make repeated contributions to the formation of the smooth contour may be removed, which facilitates a balance among the smoothness, the accuracy, and the number of control points of the smooth contour.
In some embodiments, as described with respect to fig. 2, in step 240, the smooth contour may be determined based on further-processed desired control points after further processing of the desired control points. For example, the process may allow the desired control points to be further processed based on user correction operations, so that the corrected desired control points follow the original contour of the target image region more closely, thereby improving the accuracy of the determined smooth contour of the target image region. This may be achieved, for example, by steps 241-243 shown in fig. 3.
At step 241, a correction operation for the desired control points may be received. Illustratively, at least one of the following correction operations may be received: adding at least one new desired control point, removing at least one desired control point, and adjusting the coordinates of at least one desired control point. Illustratively, the correction operation may be received via a user interface (e.g., a graphical user interface). For example, the image including the target image area and the resulting desired control points may be presented to the user via the user interface, so that the user may perform a correction operation by adding, removing, or adjusting the positions of the desired control points.
In step 242, the desired control points may be modified based on the correction operation. Illustratively, based on the received correction operation, at least one new desired control point may be added, at least one desired control point may be removed, and/or the coordinates of at least one desired control point may be adjusted.
At step 243, a smooth contour of the target image region may be determined based on the modified desired control points. This determination step may be similarly performed in the manner described with respect to step 240 of fig. 2.
Fig. 10 schematically illustrates an example block diagram of an image processing apparatus 1000 according to some embodiments of the present application. As shown in fig. 10, the image processing apparatus 1000 may include an acquisition module 1010, a fitting module 1020, a removal module 1030, and a determination module 1040.
In particular, the obtaining module 1010 may be configured to obtain a plurality of initial control points characterizing an original contour of the target image region, each of the plurality of initial control points having coordinates; the fitting module 1020 may be configured to fit the plurality of initial control points to obtain fitted coordinates of at least some of the plurality of initial control points, and perform coordinate updating on the plurality of initial control points by updating coordinates of respective initial control points based on the fitted coordinates; the removal module 1030 may be configured to remove initial control points from the plurality of initial control points that repeatedly contribute to the formation of the smooth contour based on the coordinates of the plurality of initial control points after the coordinate update to obtain desired control points; the determination module 1040 may be configured to determine a smooth contour of the target image region based on the desired control points.
The image processing apparatus 1000 may be deployed on the computing device 110 shown in fig. 1. It should be understood that the image processing apparatus 1000 may be implemented in software, hardware, or a combination of software and hardware. Several different modules may be implemented in the same software or hardware configuration, or one module may be implemented by several different software or hardware configurations.
Furthermore, the image processing apparatus 1000 may be used for implementing the image processing method described above, and the relevant details thereof have been described in detail in the foregoing, and are not repeated here for the sake of brevity. The image processing apparatus 1000 may have the same features and advantages as described with respect to the aforementioned image processing method.
Fig. 11 schematically illustrates an example block diagram of a computing device 1100, such as may represent computing device 110 in fig. 1, in accordance with some embodiments of the present application.
As shown, the example computing device 1100 includes a processing system 1101, one or more computer-readable media 1102, and one or more I/O interfaces 1103 communicatively coupled to each other. Although not shown, the computing device 1100 may also include a system bus or other data and command transfer system that couples the various components to one another. The system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus utilizing any of a variety of bus architectures, and may also include data lines such as control and data lines.
The processing system 1101 represents functionality to perform one or more operations using hardware. Accordingly, the processing system 1101 is illustrated as including hardware elements 1104 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application-specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 1104 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, a processor may be composed of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.
The computer-readable medium 1102 is illustrated as including a memory/storage 1105. Memory/storage 1105 represents memory/storage associated with one or more computer-readable media. Memory/storage 1105 may include volatile storage media (such as Random Access Memory (RAM)) and/or nonvolatile storage media (such as Read Only Memory (ROM), flash memory, optical disks, magnetic disks, and so forth). Memory/storage 1105 may include fixed media (e.g., RAM, ROM, a fixed hard drive, etc.) as well as removable media (e.g., flash memory, a removable hard drive, an optical disk, and so forth). Illustratively, the memory/storage 1105 may be used to store the various coordinates or image data, etc. mentioned in the embodiments above. The computer-readable medium 1102 may be configured in various other ways, which are further described below.
One or more input/output interfaces 1103 represent functionality that allows a user to enter commands and information to the computing device 1100, and that also allows information to be presented to the user and/or sent to other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone (e.g., for voice input), a scanner, touch functionality (e.g., capacitive or other sensors configured to detect physical touch), a camera (e.g., one that uses visible or invisible wavelengths, such as infrared, to detect touchless motion as gestures), a network card, a receiver, and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a haptic response device, a network card, a transmitter, and so forth. For example, in the above-described embodiments, user input, an original image, or initial control points may be received through an input device, and the desired control points may be presented through an output device.
The computing device 1100 also includes an image processing policy 1106. The image processing policy 1106 may be stored as computer program instructions in the memory/storage 1105. Together with the processing system 1101 and the like, the image processing policy 1106 may implement all of the functions of the respective modules of the image processing apparatus 1000 described with respect to fig. 10.
Various techniques may be described herein in the general context of software, hardware, elements, or program modules. Generally, these modules include routines, programs, objects, elements, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The terms "module," "functionality," and the like, as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of computing platforms having a variety of processors.
An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. Computer-readable media can include a variety of media that can be accessed by the computing device 1100. By way of example, and not limitation, computer-readable media may comprise "computer-readable storage media" and "computer-readable signal media".
"computer-readable storage medium" refers to a medium and/or device, and/or a tangible storage apparatus, capable of persistently storing information, as opposed to mere signal transmission, carrier wave, or signal per se. Accordingly, computer-readable storage media refers to non-signal bearing media. Computer-readable storage media include hardware such as volatile and nonvolatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer-readable instructions, data structures, program modules, logic elements/circuits or other data. Examples of computer readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage devices, tangible media, or an article of manufacture suitable for storing the desired information and accessible by a computer.
"computer-readable signal medium" refers to a signal-bearing medium configured to transmit instructions to the hardware of electronic device 1100, such as via a network. Signal media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave, data signal or other transport mechanism. Signal media also includes any information delivery media. By way of example, and not limitation, signal media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
As previously described, the hardware elements 1104 and the computer-readable media 1102 represent instructions, modules, programmable device logic, and/or fixed device logic implemented in hardware form that may be used in some embodiments to implement at least some aspects of the techniques described herein. The hardware elements may include integrated circuits or systems-on-chips, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), Complex Programmable Logic Devices (CPLDs), and other implementations in silicon or components of other hardware devices. In this context, a hardware element may serve as a processing device that performs program tasks defined by instructions, modules, and/or logic embodied by the hardware element, as well as a hardware device for storing instructions for execution, such as the computer-readable storage medium described previously.
Combinations of the foregoing may also be used to implement the various techniques and modules described herein. Thus, software, hardware, or program modules, including the modules described above, may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage medium and/or by the one or more hardware elements 1104. The computing device 1100 may be configured to implement particular instructions and/or functions corresponding to software and/or hardware modules. Thus, a module that is executable by the computing device 1100 as software may be implemented at least partially in hardware, for example, using a computer-readable storage medium of the processing system and/or the hardware elements 1104. The instructions and/or functions may be executed/operated by, for example, one or more of the computing device 1100 and/or the processing system 1101 to implement the techniques, modules, and examples described herein.
The techniques described herein may be supported by these various configurations of the computing device 1100 and are not limited to the specific examples described herein.
It will be appreciated that embodiments of the disclosure have been described with reference to different functional units for clarity. However, it will be apparent that the functionality of each functional unit may be implemented in a single unit, in a plurality of units or as part of other functional units without departing from the disclosure. For example, functionality illustrated to be performed by a single unit may be performed by a plurality of different units. Thus, references to specific functional units are only to be seen as references to suitable units for providing the described functionality rather than indicative of a strict logical or physical structure or organization. Thus, the present disclosure may be implemented in a single unit or may be physically and functionally distributed between different units and circuits.
The present application provides a computer-readable storage medium having computer-readable instructions stored thereon which, when executed, implement the image processing method described above.
A computer program product or computer program is provided that includes computer instructions stored in a computer-readable storage medium. The processor of a computing device reads the computer instructions from the computer-readable storage medium and executes them to cause the computing device to perform the image processing method provided in the various alternative implementations described above.
Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed subject matter, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the word "a" or "an" does not exclude a plurality. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims (15)

1. An image processing method comprising:
acquiring a plurality of initial control points representing the original contour of a target image area, wherein each initial control point in the plurality of initial control points has a coordinate;
fitting the plurality of initial control points to obtain fitted coordinates of at least some of the plurality of initial control points, and performing coordinate updating on the plurality of initial control points by updating coordinates of the respective initial control points based on the fitted coordinates;
removing, from the plurality of initial control points, initial control points that repeatedly contribute to the formation of the smooth contour, based on the coordinates of the plurality of initial control points after the coordinate updating, to obtain desired control points;
determining a smooth contour of the target image region based on the desired control points.
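Claim 1's four steps can be illustrated end to end. The sketch below is a minimal, non-normative reading of the claim: a moving-average fit stands in for the fitting step (the dependent claims use a sliding-window least-squares fit), and the triangle-area test of claim 10 stands in for the removal step. All function names, the window size, and the area threshold are our own illustrative choices, not the patent's.

```python
def fit_update(points, window=5):
    """Slide a window over the closed contour and replace each point's
    coordinates with the window mean. This moving-average fit is only a
    stand-in for the patent's least-squares fit."""
    n, half = len(points), window // 2
    out = []
    for i in range(n):
        xs = [points[(i + d) % n][0] for d in range(-half, half + 1)]
        ys = [points[(i + d) % n][1] for d in range(-half, half + 1)]
        out.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return out


def tri_area(a, b, c):
    """Area of the triangle spanned by points a, b, c."""
    return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2.0


def remove_redundant(points, area_eps=0.5):
    """Drop the middle of any three consecutive points whose triangle is
    smaller than `area_eps`; a nearly degenerate triangle means the middle
    point adds almost nothing to the contour (one criterion of claim 10)."""
    kept = list(points)
    changed = True
    while changed and len(kept) > 3:
        changed = False
        for i in range(len(kept)):
            a, b, c = kept[i - 1], kept[i], kept[(i + 1) % len(kept)]
            if tri_area(a, b, c) < area_eps:
                kept.pop(i)
                changed = True
                break
    return kept


def smooth_contour(points, window=5, area_eps=0.5):
    """The four claimed steps: initial control points come in, their
    coordinates are fitted and updated, redundant points are removed, and
    the survivors (the desired control points) define the smooth contour."""
    return remove_redundant(fit_update(points, window), area_eps)
```

On a densely sampled square contour this both rounds the corners and discards the many near-collinear points that contribute nothing new to the outline.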
2. The method of claim 1, wherein said fitting the plurality of initial control points to obtain fitted coordinates of at least some of the plurality of initial control points and performing coordinate updates on the plurality of initial control points by updating coordinates of respective initial control points based on the fitted coordinates comprises:
fitting the initial control points within a predetermined fitting window while sliding the predetermined fitting window by a predetermined step, to obtain fitted coordinates of at least one initial control point in the fitting window, and updating the coordinates of a corresponding initial control point of the at least one initial control point based on the fitted coordinates.
3. The method of claim 2, wherein the coordinates comprise a first coordinate value in a first dimension direction, and
wherein said fitting the initial control points within the predetermined fitting window while sliding the predetermined fitting window by the predetermined step, to obtain fitted coordinates of at least one initial control point in the fitting window, and updating the coordinates of the corresponding initial control point of the at least one initial control point based on the fitted coordinates comprises:
fitting, based on the serial numbers of the initial control points and the first coordinate values of the initial control points within a first predetermined fitting window, to obtain a fitted first coordinate value of at least one initial control point, and updating the first coordinate value of the corresponding initial control point based on the obtained fitted first coordinate value so as to update the coordinates of the corresponding initial control point.
4. The method of claim 3, wherein the coordinates further comprise a second coordinate value in a second dimension direction, and
wherein said fitting the initial control points within the predetermined fitting window while sliding the predetermined fitting window by the predetermined step, to obtain fitted coordinates of at least one initial control point in the fitting window, and updating the coordinates of the corresponding initial control point of the at least one initial control point based on the fitted coordinates further comprises:
fitting, while sliding a second predetermined fitting window by a second predetermined step, based on the serial numbers of the initial control points and the second coordinate values of the initial control points within the second predetermined fitting window, to obtain a fitted second coordinate value of at least one initial control point, and updating the second coordinate value of the corresponding initial control point based on the obtained fitted second coordinate value so as to update the coordinates of the corresponding initial control point.
5. The method of claim 2, wherein said fitting the initial control points within the predetermined fitting window while sliding the predetermined fitting window by the predetermined step, to obtain fitted coordinates of at least one initial control point in the fitting window comprises:
fitting the initial control points within the predetermined fitting window while sliding the predetermined fitting window by the predetermined step, to obtain fitted coordinates of the central initial control point of the fitting window.
6. The method of claim 1, wherein said fitting the plurality of initial control points to obtain fitted coordinates of at least some of the plurality of initial control points comprises:
fitting the plurality of initial control points by a least squares method to obtain a fitted curve, and determining fitted coordinates of at least some of the plurality of initial control points based on the fitted curve.
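A sliding-window least-squares fit against the control-point serial number, in the spirit of claims 3, 5, and 6, might be sketched as follows. The line model, window size, and all names are illustrative assumptions; the patent does not fix the degree of the fitted curve.

```python
def lsq_line(idx, vals):
    """Ordinary least-squares line v = a*i + b through points (idx[k], vals[k])."""
    n = len(idx)
    si, sv = sum(idx), sum(vals)
    sii = sum(i * i for i in idx)
    siv = sum(i * v for i, v in zip(idx, vals))
    a = (n * siv - si * sv) / (n * sii - si * si)
    b = (sv - a * si) / n
    return a, b


def window_fit_center(points, window=5):
    """Regress each coordinate against the control-point serial number
    inside a sliding window (per-dimension fitting, as in claim 3) and
    update only the centre point of each window with its fitted value
    (as in claim 5). Points too close to either end to sit at a window
    centre are left untouched in this sketch."""
    n, half = len(points), window // 2
    out = list(points)
    for i in range(half, n - half):
        idx = list(range(i - half, i + half + 1))
        new = []
        for dim in (0, 1):
            a, b = lsq_line(idx, [points[j][dim] for j in idx])
            new.append(a * i + b)  # fitted value at the window centre
        out[i] = tuple(new)
    return out
```

Because the regression line passes through the window's mean, an isolated outlier is pulled toward its neighbours while exactly collinear points are reproduced unchanged.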
7. The method of any of claims 1-6, wherein said obtaining a plurality of initial control points characterizing an original contour of a target image region comprises:
acquiring a plurality of original control points characterizing the original contour of the target image area, each original control point having coordinates; and
interpolating the plurality of original control points based on the coordinates of the original control points to obtain the plurality of initial control points.
8. The method of claim 7, wherein the coordinates have at least two dimensions, and said interpolating the plurality of original control points based on the coordinates of the original control points to obtain the plurality of initial control points comprises:
in a direction of a first dimension of the two dimensions, when the distance between adjacent original control points is greater than a first threshold distance, inserting at least one first interpolation control point between the adjacent original control points at equal intervals;
in a direction of a second dimension of the two dimensions, when the distance between adjacent transition control points of a plurality of transition control points is greater than a second threshold distance, inserting at least one second interpolation control point between the adjacent transition control points at equal intervals, wherein the plurality of transition control points comprise the plurality of original control points and the at least one first interpolation control point; and
determining the plurality of transition control points and the at least one second interpolation control point as the plurality of initial control points.
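The two-pass, per-dimension interpolation of claim 8 can be sketched as below. The gap threshold and the choice to place the inserted points on the straight segment between neighbours are illustrative assumptions; the claim only requires equally spaced insertions when a per-dimension distance exceeds a threshold.

```python
import math


def densify_dim(points, dim, max_gap):
    """One pass of claim 8: wherever the coordinate gap along `dim`
    between neighbours exceeds `max_gap`, insert equally spaced points
    on the segment joining them."""
    out = []
    for p, q in zip(points, points[1:]):
        out.append(p)
        gap = abs(q[dim] - p[dim])
        if gap > max_gap:
            k = int(math.ceil(gap / max_gap))  # number of equal sub-segments
            for j in range(1, k):
                t = j / k
                out.append((p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])))
    out.append(points[-1])
    return out


def densify(points, max_gap=2.0):
    """First dimension, then second, so the second pass also sees the
    transition control points produced by the first pass."""
    transition = densify_dim(points, 0, max_gap)
    return densify_dim(transition, 1, max_gap)
```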
9. The method of claim 7, wherein said acquiring a plurality of original control points characterizing the original contour of the target image area comprises:
acquiring an original image comprising the target image area;
performing semantic segmentation on the original image to generate a binary mask map comprising the target image area; and
extracting the plurality of original control points on the original contour of the target image area from the binary mask map.
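For the extraction step of claim 9, production code would typically run a contour tracer (e.g. OpenCV's findContours) on the binary mask produced by semantic segmentation. As a self-contained illustration only, the sketch below merely collects the mask's boundary pixels, i.e. foreground pixels with a background or out-of-bounds 4-neighbour, without ordering them along the contour; it is not the patent's extraction procedure.

```python
def boundary_points(mask):
    """Return (x, y) pixels of a binary mask (list of 0/1 rows) that lie
    on the region's contour: foreground pixels with at least one
    4-connected background (or out-of-bounds) neighbour."""
    h, w = len(mask), len(mask[0])
    pts = []
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if not (0 <= ny < h and 0 <= nx < w) or not mask[ny][nx]:
                    pts.append((x, y))  # touches the background: boundary
                    break
    return pts
```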
10. The method of claim 1, wherein said removing initial control points that repeatedly contribute to the formation of the smooth contour from the plurality of initial control points comprises:
performing at least one round of a removal process on the plurality of initial control points, and in each round of the removal process, removing initial control points having repeated contributions while sliding a predetermined removal window by a predetermined removal step, based on one of the following:
when an angle formed by any three consecutive unremoved initial control points in the removal window is greater than a threshold angle, removing the middle one of the three initial control points as an initial control point having a repeated contribution;
when the area of a triangle formed by any three consecutive unremoved initial control points in the removal window is smaller than a threshold area, removing the middle one of the three initial control points as an initial control point having a repeated contribution; and
when the maximum point-to-line distance from the unremoved initial control points in the removal window to the line connecting the first and last initial control points of the removal window is smaller than a threshold point-to-line distance, removing the initial control point corresponding to the maximum point-to-line distance as an initial control point having a repeated contribution.
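The third removal criterion of claim 10, the point-to-line-distance test, might look like the following. The window size, the threshold, and the single-round simplification are illustrative assumptions; the claim allows several rounds of removal.

```python
def point_line_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((bx - ax) * (ay - py) - (ax - px) * (by - ay))
    den = ((bx - ax) ** 2 + (by - ay) ** 2) ** 0.5
    return num / den if den else ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5


def remove_by_chord(points, window=5, eps=0.75):
    """Slide a removal window over the points. If even the farthest
    interior point lies within `eps` of the chord joining the window's
    first and last points, that farthest point contributes nothing new
    and is dropped; otherwise the window advances by one step."""
    kept = list(points)
    i = 0
    while i + window <= len(kept):
        seg = kept[i:i + window]
        a, b = seg[0], seg[-1]
        dists = [point_line_dist(p, a, b) for p in seg[1:-1]]
        dmax = max(dists)
        if dmax < eps:
            kept.pop(i + 1 + dists.index(dmax))  # window contents changed; re-test
        else:
            i += 1
    return kept
```

Note the inverted logic relative to Douglas-Peucker simplification: here a *small* maximum distance triggers removal, because a point close to the chord is redundant for the contour.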
11. The method of claim 1, wherein said determining a smooth contour of the target image region based on the desired control point comprises:
receiving a corrective action for the desired control point;
correcting the desired control point based on the correcting operation;
determining a smooth contour of the target image region based on the corrected desired control points.
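The patent leaves open how the smooth contour is rendered from the (possibly corrected) desired control points. One common interpolating choice, shown here as an assumption rather than the claimed method, is a closed uniform Catmull-Rom spline sampled segment by segment through the points.

```python
def catmull_rom_closed(ctrl, samples_per_seg=8):
    """Sample a closed uniform Catmull-Rom spline through the control
    points `ctrl`. The curve passes through every control point, which
    suits interactively corrected desired control points."""
    n = len(ctrl)
    curve = []
    for i in range(n):
        # Each segment runs from ctrl[i] to ctrl[i+1]; wrap for a closed loop.
        p0, p1, p2, p3 = (ctrl[(i + k - 1) % n] for k in range(4))
        for s in range(samples_per_seg):
            t = s / samples_per_seg
            t2, t3 = t * t, t * t * t
            curve.append(tuple(
                0.5 * (2 * p1[d]
                       + (-p0[d] + p2[d]) * t
                       + (2 * p0[d] - 5 * p1[d] + 4 * p2[d] - p3[d]) * t2
                       + (-p0[d] + 3 * p1[d] - 3 * p2[d] + p3[d]) * t3)
                for d in (0, 1)))
    return curve
```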
12. The method of claim 11, wherein the receiving a corrective action for the desired control point comprises:
receiving at least one of the following corrective actions: adding at least one new desired control point, removing at least one desired control point, and adjusting coordinates of at least one desired control point.
13. An image processing apparatus comprising:
an acquisition module configured to acquire a plurality of initial control points characterizing an original contour of a target image region, each of the plurality of initial control points having coordinates;
a fitting module configured to fit the plurality of initial control points to obtain fitted coordinates of at least some of the plurality of initial control points, and perform coordinate updating on the plurality of initial control points by updating coordinates of respective initial control points based on the fitted coordinates;
a removal module configured to remove initial control points having repeated contribution to the formation of the smooth contour from the plurality of initial control points based on the coordinates of the plurality of initial control points after the coordinate update to obtain desired control points;
a determination module configured to determine a smooth contour of the target image region based on the desired control point.
14. A computing device, comprising:
a memory configured to store computer-executable instructions; and
a processor configured to perform the method of any one of claims 1-12 when the computer-executable instructions are executed by the processor.
15. A computer-readable storage medium storing computer-executable instructions that, when executed, perform the method of any one of claims 1-12.
CN202110061765.6A 2021-01-18 2021-01-18 Image processing method and device, computing equipment and storage medium Pending CN112750139A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110061765.6A CN112750139A (en) 2021-01-18 2021-01-18 Image processing method and device, computing equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112750139A true CN112750139A (en) 2021-05-04

Family

ID=75652214

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110061765.6A Pending CN112750139A (en) 2021-01-18 2021-01-18 Image processing method and device, computing equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112750139A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101894370A (en) * 2010-07-14 2010-11-24 苏州大学 Automatic generation method of shape parameter-adaptive oracle-bone inscription contour glyphs
CN102779340A (en) * 2012-06-12 2012-11-14 华中科技大学 Automatic corresponding method of feature point coordinates based on Delaunay triangulation
CN112101105A (en) * 2020-08-07 2020-12-18 深圳数联天下智能科技有限公司 Training method and device for face key point detection model and storage medium
CN111898593A (en) * 2020-09-29 2020-11-06 湖南新云网科技有限公司 Geometric figure shape recognition method, device, equipment and storage medium
CN112215952A (en) * 2020-10-26 2021-01-12 湖北亿咖通科技有限公司 Curve drawing method, computer storage medium and electronic device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LIU, JING: "A Control Point Detection Algorithm for Image Contour Data", Mechanical Science and Technology *
CHEN, TIANFU: "Research on Bézier Curve Fitting of Image Contours", China Master's Theses Full-text Database *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113221740A (en) * 2021-05-12 2021-08-06 浙江大学 Farmland boundary identification method and system
CN114217721A (en) * 2021-11-03 2022-03-22 湖南新云网科技有限公司 Image display method, device, equipment and storage medium
EP4310797A1 (en) * 2022-07-21 2024-01-24 Aptiv Technologies Limited Method for determination of a free space boundary of a physical environment in a vehicle assistance system
EP4310798A1 (en) * 2022-07-21 2024-01-24 Aptiv Technologies Limited Method for determination of a free space boundary of a physical environment in a vehicle assistance system

Similar Documents

Publication Publication Date Title
JP7236545B2 (en) Video target tracking method and apparatus, computer apparatus, program
US10692221B2 (en) Automatic trimap generation and image segmentation
CN112750139A (en) Image processing method and device, computing equipment and storage medium
CN108229455B (en) Object detection method, neural network training method and device and electronic equipment
CN112991447B (en) Visual positioning and static map construction method and system in dynamic environment
CN108681743B (en) Image object recognition method and device and storage medium
CN109683699B (en) Method and device for realizing augmented reality based on deep learning and mobile terminal
CN111583097A (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN110287964B (en) Stereo matching method and device
CN112529015A (en) Three-dimensional point cloud processing method, device and equipment based on geometric unwrapping
CN110852349A (en) Image processing method, detection method, related equipment and storage medium
CN112308866B (en) Image processing method, device, electronic equipment and storage medium
CN115082885A (en) Point cloud target detection method, device, equipment and storage medium
CN113313810A (en) 6D attitude parameter calculation method for transparent object
CN110827301B (en) Method and apparatus for processing image
KR102628115B1 (en) Image processing method, device, storage medium, and electronic device
CN113112542A (en) Visual positioning method and device, electronic equipment and storage medium
CN111797711A (en) Model training method and device
CN111353957A (en) Image processing method, image processing device, storage medium and electronic equipment
CN113592015B (en) Method and device for positioning and training feature matching network
CN114283343A (en) Map updating method, training method and equipment based on remote sensing satellite image
CN110827341A (en) Picture depth estimation method and device and storage medium
CN113158856A (en) Processing method and device for extracting target area in remote sensing image
CN116580407A (en) Training method of text detection model, text detection method and device
KR20230083212A (en) Apparatus and method for estimating object posture

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40043965

Country of ref document: HK