CN115908632A - Image processing method, device, equipment and readable medium


Info

Publication number
CN115908632A
CN115908632A
Authority
CN
China
Prior art keywords
endpoint
pixel point
coordinate value
determining
recursion
Prior art date
Legal status
Pending
Application number
CN202211657210.9A
Other languages
Chinese (zh)
Inventor
钱生
于雷
全煜鸣
赵严
Current Assignee
Suzhou Lingyunguang Industrial Intelligent Technology Co Ltd
Original Assignee
Suzhou Lingyunguang Industrial Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Lingyunguang Industrial Intelligent Technology Co Ltd
Priority to CN202211657210.9A
Publication of CN115908632A

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses an image processing method, apparatus, device and readable medium. The method comprises: determining an annotation box for a region to be processed in an image to be processed; determining the target pixels covered by the annotation box; and taking the target pixels together with the other pixels inside the annotation box as the pixels to be processed, so that they can be labeled as the image processing target. This scheme solves the problem that pixels crossed by the annotation box are not counted in image processing scenarios: by determining the target pixels covered by the annotation box and treating them, together with the other pixels selected by the box, as the pixels to be processed, a pixel-level calculation method for image processing is realized and the precision of image processing is improved.

Description

Image processing method, device, equipment and readable medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, an image processing device, and a readable medium.
Background
Image editing software often needs to perform operations such as region annotation on an image; this is usually done by selecting the part of the image to be processed with an annotation box.
In scenarios where the fineness of the region annotation result is critical, the result of the operation is judged at the level of individual pixels. When an image is annotated with an annotation box, some pixels at the edge of the annotated region are not counted, i.e. part of the pixels that the annotation box passes through are omitted. In annotation scenarios with high precision requirements, these missing pixels can introduce calculation deviations in other systems that depend on the pixel-level result.
A method is therefore needed that solves the problem of missing the pixels crossed by the annotation box during image processing and improves the accuracy of image processing.
Disclosure of Invention
The invention provides an image processing method, apparatus, device and readable medium to solve the problem that pixels crossed by the annotation box are lost during image processing, and to improve image processing precision.
According to one aspect of the present invention, an image processing method is provided, comprising: determining an annotation box for a region to be processed in an image to be processed; determining the target pixels covered by the annotation box; and taking the target pixels together with the other pixels inside the annotation box as the pixels to be processed, so as to label them as the image processing target.
Optionally, determining the target pixels covered by the annotation box comprises: determining the annotation lines of the annotation box, each annotation line being a straight segment forming the box; and determining the pixels each annotation line passes through and taking those pixels as the target pixels.
Optionally, determining the pixels each annotation line passes through comprises: determining the coordinate values of the two endpoints of each annotation line according to a preset coordinate system in the image to be processed; and determining, from those coordinate values, the pixels each annotation line passes through.
Optionally, determining the pixels each annotation line passes through according to the coordinate values comprises, for each annotation line: determining a first coordinate value and a second coordinate value corresponding respectively to the first and second endpoints of the current annotation line; rounding the first and second coordinate values up to obtain a third and a fourth coordinate value; and taking the pixels corresponding to the third and fourth coordinate values as pixels the annotation line passes through.
Optionally, after taking the pixels corresponding to the third and fourth coordinate values as pixels the annotation line passes through, the method further comprises: determining whether the pixels corresponding to the third and fourth coordinate values are the same; and, if they are different, traversing the pixels the current annotation line passes through to determine whether the annotation line passes through further pixels.
Optionally, traversing the pixels the current annotation line passes through to determine whether the annotation line passes through further pixels comprises: determining a fifth coordinate value of a third endpoint, the third endpoint being the midpoint of the current annotation line; taking the first and second endpoints as first recursion endpoints and the third endpoint as the second recursion endpoint; determining, from the coordinate values of the first recursion endpoint and the second recursion endpoint, whether they lie in the same pixel; and, if they do not, taking the pixel containing the second recursion endpoint as a pixel the current annotation line passes through, taking the midpoint of the segment connecting the first and second recursion endpoints as the new second recursion endpoint, and repeating the same-pixel check until the first and second recursion endpoints lie in the same pixel.
Optionally, traversing the pixels the current annotation line passes through to determine whether the annotation line passes through further pixels comprises: determining the line equation of the current annotation line from the first and second coordinate values; and traversing the pixels the current annotation line passes through according to the line equation and the DDA algorithm to determine whether there are further pixels.
According to another aspect of the present invention, there is provided an image processing apparatus, comprising:
an annotation box determining unit, configured to determine the annotation box of the region to be processed in the image to be processed;
a target pixel determining unit, configured to determine the target pixels covered by the annotation box; and
a to-be-processed pixel determining unit, configured to take the target pixels and the other pixels inside the annotation box as the pixels to be processed, so as to label them as the image processing target.
According to another aspect of the present invention, there is provided an electronic device, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor, and the computer program, when executed by the at least one processor, enables the at least one processor to perform the image processing method according to any embodiment of the present invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium storing computer instructions which, when executed, cause a processor to implement the image processing method according to any embodiment of the present invention.
According to the technical scheme of the embodiments of the present invention, an annotation box of the region to be processed in the image to be processed is determined; the target pixels covered by the annotation box are determined; and the target pixels, together with the other pixels inside the annotation box, are taken as the pixels to be processed and labeled as the image processing target. This scheme solves the problem that pixels crossed by the annotation box are not counted in image processing scenarios: by determining the target pixels covered by the box and taking them, together with the other pixels selected by the box, as the pixels to be processed, a pixel-level calculation method for image processing is realized and the precision of image processing is improved.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present invention, nor do they necessarily limit the scope of the invention. Other features of the present invention will become apparent from the following description.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below cover only some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating a label box according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an arc-shaped label box according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a target pixel point and a label box according to an embodiment of the present invention;
fig. 5 is a flowchart of a method for determining a target pixel according to a second embodiment of the present invention;
fig. 6 is a schematic coordinate diagram of a pixel point and a mark line according to a second embodiment of the present invention;
fig. 7 is a schematic structural diagram of an image processing apparatus according to a third embodiment of the present invention;
fig. 8 is a schematic structural diagram of an electronic device implementing the image processing method according to the embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Embodiment 1
Fig. 1 is a flowchart of an image processing method according to the first embodiment of the present invention. This embodiment is applicable to processing a partial region of an image. The method may be executed by an image processing apparatus, which may be implemented in hardware and/or software and configured in a computer. As shown in Fig. 1, the method includes:
and S110, determining a labeling frame of the to-be-processed area in the to-be-processed image.
The annotation box is used to select the region to be processed in the image to be processed, and boxes of different shapes can be used. Fig. 2 is a schematic diagram of an annotation box suitable for this embodiment, showing how the position of the region to be processed is determined by the box. Taking a rectangular annotation box as an example, a coordinate system can be established with the lower-left vertex of the image to be processed as the origin and one pixel as the unit. When the region to be processed is marked with the annotation box, the position of the box in this coordinate system is determined from the coordinates of its four vertices, which fixes the position of the box relative to the image and thus determines the region to be processed. Fig. 2 uses a rectangular annotation box as an example; the position of the region to be processed is determined in the same way for other polygonal annotation boxes, which is not repeated here.
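As a minimal sketch of this setup (the numbers and variable names below are invented for illustration and are not taken from the patent), a rectangular annotation box can be represented in the preset coordinate system simply by its four vertex coordinates, from which its four edges follow:

```python
# Preset coordinate system: origin at the lower-left corner of the image,
# one unit per pixel; vertex coordinates may be fractional.
rect_box = [(2.3, 1.7), (7.6, 1.7), (7.6, 4.2), (2.3, 4.2)]   # four vertices, counter-clockwise

# The edges of the box, each given by a pair of endpoints; these are the
# "annotation lines" discussed in the following steps.
annotation_lines = list(zip(rect_box, rect_box[1:] + rect_box[:1]))
print(annotation_lines)
```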
In addition, Fig. 3 is a schematic diagram of an arc-shaped annotation box suitable for this embodiment. As shown in Fig. 3, an arc-shaped annotation box is in fact composed of short straight segments. The coordinates of the intersection points of adjacent segments in the preset coordinate system are determined, and the set of these coordinate points represents the position of the arc-shaped region to be processed.
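To make this polyline view of an arc concrete, the short sketch below is a hedged illustration only: the center, radius and segment count are arbitrary choices, not values from the patent. It samples an arc into straight annotation lines whose shared endpoints form the coordinate-point set described above.

```python
import math

def arc_as_annotation_lines(cx, cy, radius, start_angle, end_angle, segments=16):
    """Approximate an arc by `segments` straight annotation lines.

    Returns a list of (endpoint, endpoint) pairs; the shared endpoints of
    adjacent pairs are the intersection points whose set represents the
    position of the arc-shaped region to be processed.
    """
    points = [
        (cx + radius * math.cos(start_angle + (end_angle - start_angle) * i / segments),
         cy + radius * math.sin(start_angle + (end_angle - start_angle) * i / segments))
        for i in range(segments + 1)
    ]
    return list(zip(points[:-1], points[1:]))

# Hypothetical quarter arc centered at (5, 5) with radius 3:
lines = arc_as_annotation_lines(5.0, 5.0, 3.0, 0.0, math.pi / 2)
```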
S120: determine the target pixels covered by the annotation box.
As noted above, image editing software often needs to perform operations such as region annotation on an image. In scenarios with extremely high precision requirements for the annotation result, the result is determined in units of pixels.
The prior art performs pixel-level calculation on an image with the Python tool library pycocotools. The library provides several interfaces, such as frPyObjects, which take the coordinates of the points delimiting a region of the picture and convert the picture into a two-dimensional array of 0s and 1s whose height and width equal the pixel height and width of the picture; 0 marks a pixel outside the annotated region and 1 marks a pixel inside it, i.e. a pixel to be labeled. With this library, the pixels of the annotated region can be obtained to a first approximation, but at the edge of the region some pixels are not counted: the pixels that part of the annotation box passes through are missed. In annotation scenarios with high precision requirements, these missing pixels can cause calculation deviations in other systems that depend on the pixel-level result.
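For reference, here is a minimal sketch of this prior-art step, assuming pycocotools and numpy are installed; the image size and polygon coordinates are invented for illustration.

```python
import numpy as np
from pycocotools import mask as mask_utils

height, width = 8, 8                                      # pixel size of the picture (hypothetical)
polygon = [[1.3, 1.2, 5.7, 1.6, 5.1, 5.8, 1.9, 5.4]]      # flat [x0, y0, x1, y1, ...] vertex list

rles = mask_utils.frPyObjects(polygon, height, width)     # polygon -> run-length encoding(s)
binary_mask = mask_utils.decode(mask_utils.merge(rles))   # 2-D array of 0s and 1s, one per pixel

inside = np.argwhere(binary_mask == 1)                    # pixels counted as inside the region
# Pixels that the annotation lines merely pass through may remain 0 here,
# which is exactly the edge-pixel gap the present method sets out to close.
```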
S130: take the target pixels and the other pixels inside the annotation box as the pixels to be processed, and label them as the image processing target.
For example, Fig. 4 is a schematic diagram of target pixels and an annotation box suitable for this embodiment. As shown in Fig. 4, the annotation box is a polygon and each square is a pixel of the image to be processed. After the polygon is drawn, 4 complete pixels lie inside the box and 12 surrounding pixels are covered (crossed) by it. In this embodiment, the target pixels covered by the annotation box and the other pixels completely contained in the box are together taken as the pixels to be processed, realizing pixel-level processing within the range of the box.
This scheme solves the problem that pixels crossed by the annotation box are not counted in image processing scenarios. By determining the target pixels covered by the box and taking them, together with the other pixels selected by the box, as the pixels to be processed, a pixel-level calculation method for image processing is realized and the precision of image processing is improved.
Embodiment 2
Fig. 5 is a flowchart of a method for determining target pixels according to the second embodiment of the present invention, which further details the embodiment above. As shown in Fig. 5, the method includes:
S510: determine the annotation lines of the annotation box, each annotation line being a straight segment forming the box.
As mentioned above, the annotation box consists of several annotation lines; even the edge of an arc-shaped box is composed of short straight segments. For a polygonal annotation box, each edge of the polygon is an annotation line; for an arc-shaped box, the short straight segments forming its edge are its annotation lines.
S520: determine the pixels each annotation line passes through and take them as the target pixels.
When the annotation box covers pixels, as in Fig. 4, the annotation lines forming the box pass through 12 surrounding pixels; these serve as the target pixels and, together with the pixels inside the box, become the pixels to be processed.
In this embodiment, determining the pixels each annotation line passes through includes:
determining the coordinate values of the two endpoints of each annotation line according to the preset coordinate system in the image to be processed, and determining from those coordinate values the pixels each annotation line passes through.
Fig. 6 is a coordinate diagram of pixels and an annotation line according to the second embodiment, illustrated with the smallest unit of the annotation box, a single segment. As shown in Fig. 6, each pixel is represented by the coordinate of its upper-right corner: point P represents the upper-left pixel, point F the upper-right pixel, point O the lower-left pixel and point N the lower-right pixel. Points A and B are the two endpoints of the segment. When the coordinate system is established, the side length of a pixel can be set to 1, so pixel coordinates are integers, while the coordinates of endpoints A and B may contain decimals; the more decimal places, the more accurate the calculation result, and point coordinates with any number of decimal places are supported. The pixels the annotation line passes through are determined from the coordinate values of endpoints A and B.
In this embodiment, determining the pixels each annotation line passes through according to the coordinate values includes, for each annotation line: determining a first coordinate value and a second coordinate value corresponding respectively to the first and second endpoints of the current annotation line; rounding the first and second coordinate values up to obtain a third and a fourth coordinate value; and taking the pixels corresponding to the third and fourth coordinate values as pixels the annotation line passes through.
Since the side length of each pixel is 1, pixel coordinates are integers. Rounding the coordinate values of the first and second endpoints of the annotation line up yields integer third and fourth coordinate values, and because rounding up moves a coordinate by less than one pixel, the first and second endpoints lie in the pixels corresponding to the third and fourth coordinate values respectively.
Taking Fig. 6 as an example, rounding the coordinates of points A and B up yields the coordinates of points P and N; since the rounding stays within one pixel, points A and B lie in pixels P and N respectively.
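A minimal sketch of this rounding step follows; the helper name and the example endpoint values are illustrative assumptions, not from the patent. A pixel is identified by the integer coordinate of its upper-right corner, so rounding an endpoint's coordinates up yields the pixel that contains it.

```python
import math

def containing_pixel(point):
    """Pixel containing a (possibly fractional) point, under the convention that
    a pixel has side length 1 and is identified by its upper-right corner."""
    x, y = point
    return (math.ceil(x), math.ceil(y))

# Endpoints in the spirit of A and B in Fig. 6 (values invented for illustration):
A, B = (1.3, 1.7), (2.6, 1.2)
print(containing_pixel(A), containing_pixel(B))   # (2, 2) and (3, 2): different pixels, like P and N
```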
Optionally, whether a point of the line lies in a given pixel can instead be determined by comparing the coordinates of the point with the coordinates of the four vertices of that pixel.
In this embodiment, after taking the pixels corresponding to the third and fourth coordinate values as pixels the annotation line passes through, the method further includes: determining whether the pixels corresponding to the third and fourth coordinate values are the same; and, if they are different, traversing the pixels the current annotation line passes through to determine whether the annotation line passes through further pixels.
For example, to compute the set of all pixels the annotation line AB passes through, the central idea is to find the pixel containing each point of the line. In this embodiment a bisection method can be used. Take the midpoint of A and B, with coordinates ((Xa+Xb)/2, (Ya+Yb)/2), and round its coordinates up as above to obtain a fifth coordinate value; then perform a bisection recursion with A and B as the first recursion endpoints and the midpoint as the second recursion endpoint. If the fifth coordinate value and point A lie in the same pixel, the whole segment between the midpoint and A lies in that pixel and the recursion on this half stops. If they do not lie in the same pixel, record the pixel containing the fifth coordinate value as a pixel the annotation line passes through, take the midpoint between that point and A as the new second recursion endpoint, and repeat these steps until A and the second recursion endpoint lie in the same pixel; this completes the traversal of one half of the annotation line AB. Likewise, if the fifth coordinate value and point B lie in the same pixel, the segment between the midpoint and B lies in that pixel and the recursion stops; otherwise record the pixel containing the fifth coordinate value, take the midpoint between that point and B as the new second recursion endpoint, and repeat until B and the second recursion endpoint lie in the same pixel, completing the traversal of the other half of the annotation line AB.
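The sketch below implements a bisection traversal in this spirit. It is a hedged adaptation rather than the patent's exact procedure: it recurses on both halves of every sub-segment instead of literally stepping toward A and then toward B, it adds a small tolerance to guarantee termination in floating-point arithmetic, and it redefines the ceiling-rounding helper so the sketch is self-contained.

```python
import math

def containing_pixel(point):
    # Same ceiling-rounding convention as above: a pixel is named by its upper-right corner.
    return (math.ceil(point[0]), math.ceil(point[1]))

def pixels_on_segment(a, b, found=None, eps=1e-9):
    """Collect the pixels that segment ab passes through by bisection recursion.

    If both recursion endpoints fall in the same pixel, the whole sub-segment
    lies in that pixel and the recursion stops; otherwise the midpoint's pixel
    is recorded and both halves are bisected further.  `eps` is a termination
    guard added in this sketch, not a step taken from the patent text.
    """
    if found is None:
        found = set()
    pa, pb = containing_pixel(a), containing_pixel(b)
    found.add(pa)
    found.add(pb)
    if pa == pb or (abs(a[0] - b[0]) < eps and abs(a[1] - b[1]) < eps):
        return found
    mid = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)   # midpoint becomes the new recursion endpoint
    pixels_on_segment(a, mid, found, eps)
    pixels_on_segment(mid, b, found, eps)
    return found

# Hypothetical annotation line AB with fractional endpoint coordinates:
print(sorted(pixels_on_segment((1.3, 1.2), (5.7, 3.6))))
```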
In this embodiment, traversing the pixels the current annotation line passes through to determine whether it passes through further pixels may also comprise: determining the line equation of the current annotation line from the first and second coordinate values, and traversing the pixels the line passes through according to the line equation and the DDA algorithm to determine whether there are further pixels.
The core idea of this embodiment is to determine the pixel containing every point of the line; the bisection method reduces the amount of calculation by exploiting the fact that if two points of the line lie in one pixel, every point between them lies in that pixel as well. However, the DDA algorithm, or any other algorithm that can traverse the points of a line, can be used instead of the bisection method to achieve the same purpose.
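A rough sketch in the spirit of the DDA-style alternative follows; the fixed sub-pixel step stands in for an explicit line equation, the step size is an illustrative choice, and a coarse step may skip a pixel the line clips only at a corner.

```python
import math

def pixels_on_segment_dda(a, b, step=0.25):
    """Step along segment ab in fixed sub-pixel increments and record every pixel
    met (same upper-right-corner pixel convention as above)."""
    (x0, y0), (x1, y1) = a, b
    steps = max(1, math.ceil(max(abs(x1 - x0), abs(y1 - y0)) / step))
    dx, dy = (x1 - x0) / steps, (y1 - y0) / steps
    found = set()
    x, y = x0, y0
    for _ in range(steps + 1):
        found.add((math.ceil(x), math.ceil(y)))
        x, y = x + dx, y + dy
    return found

print(sorted(pixels_on_segment_dda((1.3, 1.2), (5.7, 3.6))))
```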
With the above calculation, all pixels the segment AB passes through are obtained. Executing the calculation for every annotation line of the annotation box yields the coordinates of all pixels its edge lines pass through; combining these with the pixel set obtained by the prior art gives the complete set of pixels associated with the annotation box.
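Putting the pieces together under the assumptions above: the helper name pixels_on_segment comes from the earlier sketch, and the interior pixel set is assumed to already use the same pixel-coordinate convention as the edge traversal.

```python
def annotation_box_pixels(vertices, interior_pixels):
    """Union of the pixels crossed by every annotation line of the box (via
    pixels_on_segment, sketched above) with the interior pixels obtained by
    the prior-art mask computation, assuming a shared coordinate convention."""
    edge_pixels = set()
    for a, b in zip(vertices, vertices[1:] + vertices[:1]):   # consecutive edges of the box
        edge_pixels |= pixels_on_segment(a, b)
    return edge_pixels | set(interior_pixels)
```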
Embodiment 3
Fig. 7 is a schematic structural diagram of an image processing apparatus according to the third embodiment of the present invention. As shown in Fig. 7, the apparatus includes:
an annotation box determining unit 710, configured to determine the annotation box of the region to be processed in the image to be processed;
a target pixel determining unit 720, configured to determine the target pixels covered by the annotation box; and
a to-be-processed pixel determining unit 730, configured to take the target pixels and the other pixels inside the annotation box as the pixels to be processed, so as to label them as the image processing target.
Optionally, the target pixel determining unit 720 is configured to determine the annotation lines of the annotation box, each annotation line being a straight segment forming the box, and to determine the pixels each annotation line passes through, taking those pixels as the target pixels.
Optionally, when determining the pixels each annotation line passes through, the target pixel determining unit 720 determines the coordinate values of the two endpoints of each annotation line according to the preset coordinate system in the image to be processed, and determines the pixels each annotation line passes through from those coordinate values.
Optionally, when determining from the coordinate values the pixels each annotation line passes through, the target pixel determining unit 720 performs, for each annotation line: determining a first coordinate value and a second coordinate value corresponding respectively to the first and second endpoints of the current annotation line; rounding the first and second coordinate values up to obtain a third and a fourth coordinate value; and taking the pixels corresponding to the third and fourth coordinate values as pixels the annotation line passes through.
Optionally, after taking the pixels corresponding to the third and fourth coordinate values as pixels the annotation line passes through, the target pixel determining unit 720 is further configured to determine whether the pixels corresponding to the third and fourth coordinate values are the same and, if they are different, to traverse the pixels the current annotation line passes through and determine whether the annotation line passes through further pixels.
Optionally, when traversing the pixels the current annotation line passes through to determine whether the annotation line passes through further pixels, the target pixel determining unit 720 performs: determining a fifth coordinate value of a third endpoint, the third endpoint being the midpoint of the current annotation line; taking the first and second endpoints as first recursion endpoints and the third endpoint as the second recursion endpoint; determining from their coordinate values whether a first recursion endpoint and the second recursion endpoint lie in the same pixel; and, if not, taking the pixel containing the second recursion endpoint as a pixel the current annotation line passes through, taking the midpoint of the segment connecting the first and second recursion endpoints as the new second recursion endpoint, and repeating the same-pixel check until the first and second recursion endpoints lie in the same pixel.
Optionally, when traversing the pixels the current annotation line passes through to determine whether the annotation line passes through further pixels, the target pixel determining unit 720 may instead determine the line equation of the current annotation line from the first and second coordinate values and traverse the pixels the line passes through according to the line equation and the DDA algorithm to determine whether there are further pixels.
The image processing apparatus provided by the embodiments of the present invention can execute the image processing method provided by any embodiment of the present invention, and has functional modules corresponding to the executed method together with its beneficial effects.
Embodiment 4
FIG. 8 illustrates a schematic diagram of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in Fig. 8, the electronic device 10 includes at least one processor 11 and a memory communicatively connected to the at least one processor 11, such as a read-only memory (ROM) 12 and a random access memory (RAM) 13. The memory stores a computer program executable by the at least one processor, and the processor 11 can perform various suitable actions and processes according to the computer program stored in the ROM 12 or loaded from a storage unit 18 into the RAM 13. The RAM 13 can also store various programs and data needed for the operation of the electronic device 10. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14; an input/output (I/O) interface 15 is also connected to the bus 14.
A number of components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, or the like; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The processor 11 performs the various methods and processes described above, such as an image processing method.
In some embodiments, the image processing method may be implemented as a computer program tangibly embodied in a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into the RAM 13 and executed by the processor 11, one or more steps of the image processing method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the image processing method in any other suitable way (e.g. by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose and which receives data and instructions from, and transmits data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for implementing the methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be performed. A computer program can execute entirely on a machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), blockchain networks, and the Internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, and is a host product in a cloud computing service system, so that the defects of high management difficulty and weak service expansibility in the traditional physical host and VPS service are overcome.
It should be understood that various forms of the flows shown above, reordering, adding or deleting steps, may be used. For example, the steps described in the present invention may be executed in parallel, sequentially, or in different orders, and are not limited herein as long as the desired results of the technical solution of the present invention can be achieved.
The above-described embodiments should not be construed as limiting the scope of the invention. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. An image processing method, comprising:
determining an annotation box for a region to be processed in an image to be processed;
determining the target pixels covered by the annotation box; and
taking the target pixels and the other pixels inside the annotation box as pixels to be processed, so as to label the pixels to be processed as the image processing target.
2. The method of claim 1, wherein determining the target pixels covered by the annotation box comprises:
determining the annotation lines of the annotation box, each annotation line being a straight segment forming the box; and
determining the pixels each annotation line passes through and taking those pixels as the target pixels.
3. The method of claim 2, wherein determining the pixels each annotation line passes through comprises:
determining coordinate values of the two endpoints of each annotation line according to a preset coordinate system in the image to be processed; and
determining the pixels each annotation line passes through according to the coordinate values.
4. The method of claim 3, wherein determining the pixels each annotation line passes through according to the coordinate values comprises, for each annotation line:
determining a first coordinate value and a second coordinate value corresponding respectively to a first endpoint and a second endpoint of the current annotation line;
rounding the first coordinate value and the second coordinate value up to obtain a third coordinate value and a fourth coordinate value; and
taking the pixels corresponding to the third coordinate value and the fourth coordinate value as pixels the annotation line passes through.
5. The method of claim 4, further comprising, after taking the pixels corresponding to the third coordinate value and the fourth coordinate value as pixels the annotation line passes through:
determining whether the pixels corresponding to the third coordinate value and the fourth coordinate value are the same; and
if they are different, traversing the pixels the current annotation line passes through to determine whether the annotation line passes through further pixels.
6. The method of claim 5, wherein traversing the pixels the current annotation line passes through to determine whether the annotation line passes through further pixels comprises:
determining a fifth coordinate value of a third endpoint, the third endpoint being the midpoint of the current annotation line;
taking the first endpoint and the second endpoint as first recursion endpoints and the third endpoint as a second recursion endpoint;
determining, according to the coordinate values of a first recursion endpoint and the second recursion endpoint, whether they lie in the same pixel; and
if they do not lie in the same pixel, taking the pixel containing the second recursion endpoint as a pixel the current annotation line passes through, taking the midpoint of the segment connecting the first recursion endpoint and the second recursion endpoint as the new second recursion endpoint, and again determining from their coordinate values whether the first recursion endpoint and the second recursion endpoint lie in the same pixel, until they do.
7. The method of claim 3, wherein traversing the pixels the current annotation line passes through to determine whether the annotation line passes through further pixels comprises:
determining the line equation of the current annotation line according to the first coordinate value and the second coordinate value; and
traversing the pixels the current annotation line passes through according to the line equation and the DDA algorithm to determine whether there are further pixels.
8. An image processing apparatus, comprising:
an annotation box determining unit, configured to determine an annotation box of a region to be processed in an image to be processed;
a target pixel determining unit, configured to determine the target pixels covered by the annotation box; and
a to-be-processed pixel determining unit, configured to take the target pixels and the other pixels inside the annotation box as pixels to be processed, so as to label the pixels to be processed as the image processing target.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor, and the computer program, when executed by the at least one processor, enables the at least one processor to perform the image processing method of any one of claims 1-7.
10. A computer-readable storage medium storing computer instructions which, when executed, cause a processor to implement the image processing method of any one of claims 1-7.
CN202211657210.9A, filed 2022-12-22: Image processing method, device, equipment and readable medium; published as CN115908632A (pending)

Priority Application (1)

Application Number: CN202211657210.9A; Priority/Filing Date: 2022-12-22; Title: Image processing method, device, equipment and readable medium

Publication (1)

Publication Number: CN115908632A; Publication Date: 2023-04-04

Family

ID=86488035

Country Status (1)

CN (1) CN115908632A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination