CN109697722B - Method and device for generating a trimap - Google Patents

Method and device for generating a trimap Download PDF

Info

Publication number
CN109697722B
CN109697722B (application CN201710994495.8A)
Authority
CN
China
Prior art keywords
image
pixel
generating
trimap
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710994495.8A
Other languages
Chinese (zh)
Other versions
CN109697722A (en)
Inventor
安山
陈宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd, Beijing Jingdong Shangke Information Technology Co Ltd filed Critical Beijing Jingdong Century Trading Co Ltd
Priority to CN201710994495.8A priority Critical patent/CN109697722B/en
Publication of CN109697722A publication Critical patent/CN109697722A/en
Application granted granted Critical
Publication of CN109697722B publication Critical patent/CN109697722B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/12Edge-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/187Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/194Segmentation; Edge detection involving foreground-background segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/28Indexing scheme for image data processing or generation, in general involving image processing hardware

Abstract

The application discloses a method and a device for generating a trimap, relating to the field of digital image processing. The method comprises the following steps: performing image processing on a predetermined image to generate a first image, the first image being a binary image; generating pixel position data according to a preset parameter and the pixel points in the first image; performing pixel value judgment on all pixel points in the first image and their surrounding pixel points according to the pixel position data, and generating a second image according to the judgment result; and generating a trimap of the predetermined image from the first image and the second image. The method and the device can generate a trimap from nothing more than a binary image, improving the efficiency of trimap generation.

Description

Method and device for generating a trimap
Technical Field
The invention relates to the field of digital image processing, and in particular to a method and a device for generating a trimap.
Background
Matting is a special image segmentation technique: the technique and process of dividing an image into several regions with distinctive properties and extracting the object of interest, and it is a key step from image processing to image analysis. In matting, a binary image is often used for further processing during video compositing. A binary image is an image obtained by image segmentation in which only an object foreground and a background are present, and each pixel belongs either to the foreground or to the background. In the binary image, the pixel value of the object foreground is set to 0 and the pixel value of the background is set to 255. Because the binary image is a single-channel image, and the transparency of object edges must be computed when compositing images, the input required by matting algorithms is a trimap. In a trimap there are three distinct pixel values, for the foreground portion, the background portion and an uncertain region: foreground pixels are set to 255, background pixels are set to 0, and pixels of the uncertain region are set to 128.
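For illustration only, the following NumPy sketch (not part of the patent text) checks these conventions on a small, made-up single-channel trimap by counting the pixels of each region.

    import numpy as np

    # Hypothetical single-channel trimap following the encoding above:
    # foreground = 255, background = 0, uncertain region = 128.
    trimap = np.array([[0,   0, 128, 255],
                       [0, 128, 128, 255],
                       [0,   0, 128, 255]], dtype=np.uint8)

    foreground = np.count_nonzero(trimap == 255)
    background = np.count_nonzero(trimap == 0)
    uncertain  = np.count_nonzero(trimap == 128)
    print(foreground, background, uncertain)  # 3 5 4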
In the prior art, a binary image is processed in the following ways to generate a trimap. Application No. CN201710115845.9 discloses a fast matting method for images with a single background. The background color is identified automatically by sampling the image boundary, and the RGB color model is converted into the HSI color model to reduce the influence of illumination and to segment foreground and background automatically; a trimap is then obtained by separately eroding the foreground and the background. Application No. CN201510260422.7 discloses an image foreground extraction method based on superpixels and a fast trimap. Superpixel preprocessing is performed with a superpixel algorithm; features such as average color and barycentric position coordinates are extracted for each superpixel block; the preliminary hard segmentation edge is then smoothed to obtain the final hard segmentation edge, and the foreground image is extracted and output. These methods can generate a trimap, but the whole image must be considered to extract it, so their efficiency in actual use is low.
Therefore, a new method and apparatus for generating a trimap are needed.
The above information disclosed in this background section is only for enhancement of understanding of the background of the invention and therefore it may contain information that does not constitute prior art that is already known to a person of ordinary skill in the art.
Disclosure of Invention
In view of the above, the present invention provides a method and an apparatus for generating a trimap, which can generate a trimap from nothing more than a binary image and improve the efficiency of trimap generation.
Additional features and advantages of the invention will be set forth in the detailed description which follows, or may be learned by practice of the invention.
According to an aspect of the present invention, a method for generating a trimap is provided, the method comprising: performing image processing on a predetermined image to generate a first image, the first image being a binary image; generating pixel position data according to a preset parameter and the pixel points in the first image; performing pixel value judgment on all pixel points in the first image and their surrounding pixel points according to the pixel position data, and generating a second image according to the judgment result; and generating a trimap of the predetermined image from the first image and the second image.
In an exemplary embodiment of the present disclosure, the method further comprises: performing matting processing on the predetermined image through the trimap.
In an exemplary embodiment of the present disclosure, the image processing of the predetermined image to generate the first image includes: performing image segmentation processing on the preset image to obtain segmentation data; and generating a first image from the segmentation data.
In an exemplary embodiment of the present disclosure, the generating pixel position data according to a predetermined parameter and a pixel point in the first image includes: determining a preset parameter value; generating preset pixel position data around each pixel point in the first image according to the preset parameters; and marking the pixel position data by a preset numerical value.
In an exemplary embodiment of the present disclosure, the predetermined number of pixel position data generated from the preset parameter satisfies: N = (2×M+1)² − 1; wherein N is the predetermined number of pixel positions, and M is the preset parameter and is a positive integer.
In an exemplary embodiment of the present disclosure, performing the pixel value judgment on all the pixel points in the first image and their surrounding pixel points according to the pixel position data, and generating a second image according to the judgment result, comprises: obtaining a pixel point i_n in the first image; obtaining the N pixel points surrounding pixel point i_n according to the pixel position data; comparing the pixel value of pixel point i_n with the pixel values of each of its N surrounding pixel points; and, when the pixel value of pixel point i_n differs from the pixel value of any of its N surrounding pixel points, setting pixel point i_n to a predetermined value.
In an exemplary embodiment of the present disclosure, further comprising: and sequentially reading the pixel points in the first image, and comparing the pixel values until all the pixel points in the first image are traversed.
In an exemplary embodiment of the present disclosure, further comprising: and generating the second image according to the pixel value comparison result.
In an exemplary embodiment of the present disclosure, the generating a trimap image of a predetermined image by the first image and the second image includes: generating a third image with the same size as the first image, wherein all pixel values of the third image are 255; generating a fourth image with the same size as the first image, wherein all pixel values of the fourth image are 128; and generating a trimap image of the predetermined image from the second image, the third image and the fourth image.
In an exemplary embodiment of the present disclosure, the generating a trimap image of the predetermined image from the second image, the third image and the fourth image includes: performing image superposition on the second image, the third image and the fourth image through a trimap image calculation formula to generate a trimap image of the predetermined image;
wherein the trimap calculation formula is:
I_trimap = I_binary - I_temp1 + I_temp2
wherein I_trimap is the trimap, I_binary is the second image, I_temp1 is the third image, and I_temp2 is the fourth image.
According to an aspect of the present invention, an apparatus for generating a trimap is provided, the apparatus comprising: a first module for performing image processing on a predetermined image to generate a first image, the first image being a binary image; a position module for generating pixel position data according to a preset parameter and the pixel points in the first image; a judging module for performing pixel value judgment on all the pixel points in the first image and their surrounding pixel points according to the pixel position data, and generating a second image according to the judgment result; and a trimap image module for generating a trimap image of the predetermined image through the first image and the second image.
In an exemplary embodiment of the present disclosure, further comprising: and the matting module is used for carrying out matting processing on the preset image through the trisection image.
According to the method and the device for generating a trimap, a trimap can be generated from nothing more than a binary image, and the efficiency of trimap generation is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. The drawings described below are only some embodiments of the invention and other drawings may be derived from those drawings by a person skilled in the art without inventive effort.
FIG. 1 is a system architecture illustrating a method for generating a trimap image in accordance with an exemplary embodiment.
FIG. 2 is a flow chart illustrating a method for generating a trimap image in accordance with an exemplary embodiment.
FIG. 3 is a schematic illustration in a method for generating a trimap, according to another exemplary embodiment.
FIG. 4 is a flowchart illustrating a method for generating a trimap image in accordance with another exemplary embodiment.
FIG. 5 is a diagram illustrating image processing in a method for generating a trimap image in accordance with an exemplary embodiment.
FIG. 6 is a diagram illustrating image processing in a method for generating a trimap image in accordance with an exemplary embodiment.
FIG. 7 is a diagram illustrating image processing in a method for generating a trimap image in accordance with an exemplary embodiment.
FIG. 8 is a block diagram illustrating an apparatus for generating a trimap, according to an example embodiment.
FIG. 9 is a block diagram illustrating an electronic device in accordance with an example embodiment.
FIG. 10 is a schematic diagram illustrating a computer readable medium according to an example embodiment.
Detailed Description of Embodiments
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention may be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations or operations have not been shown or described in detail to avoid obscuring aspects of the invention.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
It will be understood that, although the terms first, second, third, etc. may be used herein to describe various components, these components should not be limited by these terms. These terms are used to distinguish one element from another. Thus, a first component discussed below may be termed a second component without departing from the teachings of the disclosed concept. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be appreciated by those skilled in the art that the drawings are merely schematic representations of exemplary embodiments, and that the blocks or flow charts in the drawings are not necessarily required to practice the present invention and are, therefore, not intended to limit the scope of the present invention.
The following detailed description of exemplary embodiments of the disclosure refers to the accompanying drawings.
FIG. 1 is a system architecture illustrating a method for generating a trimap image in accordance with an exemplary embodiment.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may have various communication client applications installed thereon, such as a shopping application, a web browser application, a search application, an instant messaging tool, a mailbox client, social platform software, and the like.
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 105 may be a server that provides various services, such as processing image processing requests submitted by users using the terminal apparatuses 101, 102, 103. The background management server may analyze and otherwise process the received data such as the request, and feed back a processing result (e.g., an image processing result, a matting processing result, and the like) to the terminal device.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
FIG. 2 is a flow chart illustrating a method for generating a trimap image in accordance with an exemplary embodiment.
As shown in fig. 2, in S202, a predetermined image is subjected to image processing to generate a first image, which is a binary image. This may, for example, include: performing image segmentation processing on the predetermined image to obtain segmentation data; and generating the first image from the segmentation data. As described above, in the first image the pixel value of the object foreground is set to 0 and that of the background is set to 255. Image segmentation is commonly used to locate objects and boundaries (lines, curves, etc.) in images. It is the process of assigning a label to every pixel in an image such that pixels with the same label share certain visual characteristics. The result of image segmentation is a set of sub-regions covering the image, or a set of contours extracted from the image. Each pixel in a sub-region is similar with respect to some measured or computed property, such as color, brightness or texture, while adjacent regions differ significantly with respect to the same property. Commonly used image segmentation algorithms include edge-detection-based methods, threshold segmentation, region growing, and region splitting and merging, and the application is not limited thereto.
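As an illustration of this step, a binary first image can be obtained with a simple global threshold. This is only one possible segmentation; the Otsu threshold, the grayscale loading and the file name used below are assumptions for the sketch and are not prescribed by the patent.

    import cv2

    # "input.jpg" is a placeholder for the predetermined image.
    gray = cv2.imread("input.jpg", cv2.IMREAD_GRAYSCALE)

    # One possible segmentation: Otsu's global threshold. THRESH_BINARY_INV matches the
    # convention stated above (object foreground = 0, background = 255) when the object
    # is brighter than the background; whether to invert depends on the actual image.
    _, first_image = cv2.threshold(gray, 0, 255,
                                   cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)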
In S204, pixel position data is generated according to a preset parameter and the pixel points in the first image. This may, for example, include: determining a preset parameter value; generating predetermined pixel position data around each pixel point in the first image according to the preset parameter; and marking the pixel position data with preset numerical values. The number of pixel position data generated from the preset parameter satisfies N = (2×M+1)² − 1, wherein N is the predetermined number of pixel positions, and M is the preset parameter and is a positive integer.
In one embodiment, a series of pixel position data is generated as follows. If the pixels of the segmentation edge are extended by 1 pixel both outward and inward, this may be noted as M = 1; the resulting pixel positions then range from −M to M in each direction, excluding the origin (0, 0). The number of generated pixel positions is N = (2×M+1)² − 1, where M is the preset parameter and a positive integer. Referring to the example of fig. 3, when M = 1 there are 8 pixel positions around the pixel point to be calculated, and when M = 2 there are 24. These pixel positions may also be assigned values for differentiation, for example as shown in fig. 3.
In another embodiment, the pixels of the segmentation edge are extended by 2 pixels both outward and inward, noted as M = 2. Referring to the example of fig. 3, when M = 2 there are 24 pixel positions around the pixel point to be calculated. These pixel positions may also be assigned values for differentiation, for example as shown in fig. 3.
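The pixel position data described above can be represented as a list of (row, column) offsets around the pixel point to be calculated. The following sketch (an illustration, not patent text) builds these offsets for a given M and checks that their number equals N = (2×M+1)² − 1.

    def neighbor_offsets(m):
        """(row, column) offsets of the (2m+1)x(2m+1) square around a pixel, origin excluded."""
        return [(dy, dx)
                for dy in range(-m, m + 1)
                for dx in range(-m, m + 1)
                if (dy, dx) != (0, 0)]

    for m in (1, 2):
        offsets = neighbor_offsets(m)
        assert len(offsets) == (2 * m + 1) ** 2 - 1
        print(m, len(offsets))  # M = 1 -> 8 positions, M = 2 -> 24 positions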
In S206, the pixel value of every pixel point in the first image is compared with the pixel values of its surrounding pixel points according to the pixel position data, and a second image is generated according to the comparison result. For example, for each pixel in the first image, a numerical comparison of pixel values is performed with the N surrounding pixel points. According to the comparison result, the attribute value representing the region of the pixel is reassigned (set as the uncertain region), so as to generate the second image.
In S208, a trimap of the predetermined image is generated from the first image and the second image. A third image of the same size as the first image is generated, all of whose pixel values are 255; a fourth image of the same size as the first image is generated, all of whose pixel values are 128; and the second image, the third image and the fourth image are superimposed to generate the trimap of the predetermined image.
According to this method for generating a trimap, each pixel point in the binary image is compared and calculated to obtain the uncertain region, and the trimap is then generated, so that a trimap can be produced from nothing more than a binary image, improving the efficiency of trimap generation.
In an exemplary embodiment of the present disclosure, the method further comprises: performing matting processing on the predetermined image through the trimap. In the field of image matting, processing algorithms such as Bayesian matting, KNN matting and Poisson matting may be used, and the trimap produced by the method of the present application can be provided as input to such matting algorithms.
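As a simple illustration of how the trimap feeds a matting stage (this is not any of the algorithms named above and not part of the patent), the sketch below converts the 255/0/128 trimap into an initial alpha map: known foreground becomes 1.0, known background 0.0, and the uncertain region 0.5, which is the region a matting algorithm would then refine.

    import numpy as np

    def initial_alpha(trimap):
        """Seed alpha map from a trimap (255 = foreground, 0 = background, 128 = uncertain)."""
        alpha = np.zeros(trimap.shape, dtype=np.float32)
        alpha[trimap == 255] = 1.0   # known foreground
        alpha[trimap == 128] = 0.5   # uncertain region, to be refined by the matting algorithm
        return alpha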
It should be clearly understood that the present disclosure describes how to make and use particular examples, but the principles of the present disclosure are not limited to any details of these examples. Rather, these principles can be applied to many other embodiments based on the teachings of the present disclosure.
FIG. 4 is a flowchart illustrating a method for generating a trimap image in accordance with another exemplary embodiment.
In S402, a pixel point i_n in the first image is obtained.
In S404, the N pixel points surrounding pixel point i_n are obtained according to the pixel position data. For example, when M = 2, the 24 other pixel points at the positions around the pixel point to be calculated can be obtained.
In S406, the pixel value of pixel point i_n is compared with the pixel values of each of its N surrounding pixel points. For example, the pixel values of the 24 other pixel points can be read; since the first image is a binary image, each pixel value is either 255 or 0.
In S408, when the pixel value of pixel point i_n differs from the pixel values of its N surrounding pixel points, pixel point i_n is set to a predetermined value. For example, when the pixel value of i_n differs from that of any of its N surrounding pixel points, pixel point i_n is regarded as belonging to the uncertain region and is assigned the value 128. When the values of i_n and its 24 surrounding pixel points are all the same, the original value of i_n is kept unchanged.
In S410, it is determined whether all the pixel points in the first image have been traversed, and if not, the pixel points are continuously read for processing. If the traversal has been completed, S412 is entered.
In S412, the second image is generated. For example, the first image after the pixel value reassignment process may be treated as the second image.
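Steps S402 to S412 can be transcribed directly as the following sketch (an illustration, not patent text); replicating the image border when a neighbourhood extends outside the image is an assumption, since the text does not specify how such pixels are handled.

    import numpy as np

    def mark_uncertain(first_image, m=2):
        """Set to 128 every pixel whose (2m+1)x(2m+1) neighbourhood is not uniform (S402-S412)."""
        h, w = first_image.shape
        padded = np.pad(first_image, m, mode="edge")   # assumption: replicate the border
        offsets = [(dy, dx) for dy in range(-m, m + 1) for dx in range(-m, m + 1)
                   if (dy, dx) != (0, 0)]              # the N = (2m+1)^2 - 1 surrounding positions
        second_image = first_image.copy()
        for y in range(h):                             # S402, S410: traverse every pixel point
            for x in range(w):
                value = first_image[y, x]
                # S404, S406: read and compare the N surrounding pixel values.
                if any(padded[y + m + dy, x + m + dx] != value for dy, dx in offsets):
                    second_image[y, x] = 128           # S408: mark as the uncertain region
        return second_image                            # S412: the second image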
In an exemplary embodiment of the present disclosure, the generating a trimap image of a predetermined image by the first image and the second image includes: generating a third image with the same size as the first image, wherein all pixel values of the third image are 255; generating a fourth image with the same size as the first image, wherein all pixel values of the fourth image are 128; and generating a trimap image of the predetermined image from the second image, the third image and the fourth image.
In an exemplary embodiment of the present disclosure, the generating a trimap image of the predetermined image from the second image, the third image and the fourth image includes: performing image superposition on the second image, the third image and the fourth image through a trimap image calculation formula to generate a trimap image of the predetermined image;
wherein the trimap calculation formula is:
I_trimap = I_binary - I_temp1 + I_temp2
wherein I_trimap is the trimap, I_binary is the second image, I_temp1 is the third image, and I_temp2 is the fourth image.
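The superposition can be transcribed literally as below; performing the arithmetic in a signed integer type and clipping the result back to the 8-bit range are assumptions (the text does not specify how out-of-range values are handled), and the resulting pixel values depend on the conventions used for the second image.

    import numpy as np

    def combine_trimap(second_image):
        """I_trimap = I_binary - I_temp1 + I_temp2, with I_temp1 = 255 and I_temp2 = 128 everywhere."""
        temp1 = np.full(second_image.shape, 255, dtype=np.int16)   # third image, all 255
        temp2 = np.full(second_image.shape, 128, dtype=np.int16)   # fourth image, all 128
        trimap = second_image.astype(np.int16) - temp1 + temp2     # the superposition formula
        return np.clip(trimap, 0, 255).astype(np.uint8)            # assumption: clip to 8-bit range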
Fig. 5, 6 and 7 are schematic diagrams illustrating image processing in a method for generating a trimap according to an exemplary embodiment. Fig. 5 illustrates a predetermined input image; fig. 6 illustrates the first image after the image segmentation process, which is a binary image containing only two pixel values, black and white; and fig. 7 illustrates the trimap obtained after the pixel processing of the present invention.
Those skilled in the art will appreciate that all or part of the steps implementing the above embodiments are implemented as computer programs executed by a CPU. The computer program, when executed by the CPU, performs the functions defined by the method provided by the present invention. The program may be stored in a computer readable storage medium, which may be a read-only memory, a magnetic or optical disk, or the like.
Furthermore, it should be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the method according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
The following are embodiments of the apparatus of the present invention that may be used to perform embodiments of the method of the present invention. For details which are not disclosed in the embodiments of the apparatus of the present invention, reference is made to the embodiments of the method of the present invention.
FIG. 8 is a block diagram illustrating an apparatus for generating a trimap, according to an example embodiment.
The first module 802 is configured to perform image processing on a predetermined image to generate a first image, where the first image is a binary image.
The location module 804 is configured to generate pixel location data according to a predetermined parameter and a pixel point in the first image.
The judging module 806 is configured to perform pixel value judgment on all pixel points in the first image and their surrounding pixel points according to the pixel position data, and to generate a second image according to the judgment result.
The trimap image module 808 is configured to generate a trimap image of a predetermined image through the first image and the second image.
In an exemplary embodiment of the present disclosure, the apparatus further comprises a matting module (not shown in the figure) for performing matting processing on the predetermined image through the trimap.
According to this device for generating a trimap, each pixel point in the binary image is compared and calculated to obtain the uncertain region, and the trimap is then generated, so that a trimap can be produced from nothing more than a binary image, improving the efficiency of trimap generation.
FIG. 9 is a block diagram illustrating an electronic device in accordance with an example embodiment.
An electronic device 200 according to this embodiment of the invention is described below with reference to fig. 9. The electronic device 200 shown in fig. 9 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 9, the electronic device 200 is embodied in the form of a general purpose computing device. The components of the electronic device 200 may include, but are not limited to: at least one processing unit 210, at least one memory unit 220, a bus 230 connecting different system components (including the memory unit 220 and the processing unit 210), a display unit 240, and the like.
The storage unit stores program code executable by the processing unit 210 to cause the processing unit 210 to perform the steps according to various exemplary embodiments of the present invention described above in this specification. For example, the processing unit 210 may perform the steps as shown in fig. 2 and 3.
The memory unit 220 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM)2201 and/or a cache memory unit 2202, and may further include a read only memory unit (ROM) 2203.
The storage unit 220 may also include a program/utility 2204 having a set (at least one) of program modules 2205, such program modules 2205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 230 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 200 may also communicate with one or more external devices 300 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 200, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 200 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 250. Also, the electronic device 200 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 260. The network adapter 260 may communicate with other modules of the electronic device 200 via the bus 230. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 200, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, or a network device, etc.) to execute the above-described method according to the embodiments of the present disclosure.
FIG. 10 is a schematic diagram illustrating a computer readable medium according to an example embodiment.
Referring to fig. 10, a program product 400 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
The computer readable medium carries one or more programs which, when executed by a device, cause the device to perform the following functions: performing image processing on a predetermined image to generate a first image, the first image being a binary image; generating pixel position data according to a preset parameter and the pixel points in the first image; performing pixel value judgment on all pixel points in the first image and their surrounding pixel points according to the pixel position data, and generating a second image according to the judgment result; and generating a trimap of the predetermined image from the first image and the second image.
Those skilled in the art will appreciate that the modules described above may be distributed in the apparatus according to the description of the embodiments, or may be modified accordingly in one or more apparatuses unique from the embodiments. The modules of the above embodiments may be combined into one module, or further split into multiple sub-modules.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiment of the present invention can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which can be a personal computer, a server, a mobile terminal, or a network device, etc.) to execute the method according to the embodiment of the present invention.
Exemplary embodiments of the present invention are specifically illustrated and described above. It is to be understood that the invention is not limited to the precise construction, arrangements, or instrumentalities described herein; on the contrary, the invention is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
In addition, the structures, proportions, sizes and the like shown in the drawings of this specification are only intended to match the contents disclosed in the specification so that they can be understood and read by those skilled in the art; they are not intended to limit the conditions under which the present disclosure can be implemented and therefore have no essential technical significance. Any modification of structure, change of proportional relationship or adjustment of size that does not affect the effects that can be produced and the purposes that can be achieved by the present disclosure shall still fall within the scope covered by the technical contents disclosed herein. Meanwhile, the terms "above", "first", "second" and "a" used in this specification are for clarity of description only and are not intended to limit the scope of the present disclosure; changes or adjustments of their relative relationships, without substantial changes to the technical content, shall also be regarded as falling within the implementable scope of the present disclosure.

Claims (11)

1. A method for generating a trimap image, comprising:
performing image processing on a preset image to generate a first image, wherein the first image is a binary image;
determining a preset parameter value M, wherein M is a positive integer;
generating preset pixel position data around each pixel point in the first image according to the preset parameter M;
marking the pixel position data by a preset numerical value;
performing pixel value judgment on all pixel points in the first image and their surrounding pixel points according to the pixel position data, and generating a second image according to the judgment result; and
generating a trimap image of a predetermined image through the first image and the second image;
wherein, around each pixel point in the first image, generating predetermined pixel position data by the preset parameter includes:
taking a target pixel point in the first image as a central point, taking (2M +1) pixel units as side lengths to form a square, wherein all pixel points except the target pixel point in the square are preset pixel position data of the target pixel point;
wherein the generating a trimap image of a predetermined image through the first image and the second image comprises: generating a third image with the same size as the first image, wherein all pixel values of the third image are 255; generating a fourth image with the same size as the first image, wherein all pixel values of the fourth image are 128; and generating a trimap image of the predetermined image from the second image, the third image and the fourth image;
wherein the generating a trimap image of the predetermined image from the second image, the third image and the fourth image comprises:
performing image superposition on the second image, the third image and the fourth image through a trimap image calculation formula to generate a trimap image of the predetermined image;
wherein the trimap calculation formula is:
I_trimap = I_binary - I_temp1 + I_temp2
wherein I_trimap is the trimap, I_binary is the second image, I_temp1 is the third image, and I_temp2 is the fourth image.
2. The method of claim 1, further comprising:
performing matting processing on the predetermined image through the trimap image.
3. The method of claim 1, wherein the image processing of the predetermined image to generate the first image comprises:
performing image segmentation processing on the preset image to obtain segmentation data; and
a first image is generated from the segmentation data.
4. The method of claim 1, wherein making a square with a target pixel in the first image as a center point and (2M +1) pixel units as side lengths, all pixels in the square except the target pixel being predetermined pixel position data of the target pixel, comprises:
N = (2×M+1)² − 1;
wherein N is the preset number of pixel positions of the target pixel point, and M is the preset parameter value and is a positive integer.
5. The method according to claim 4, wherein performing the pixel value judgment on all the pixel points in the first image and their surrounding pixel points according to the pixel position data, and generating the second image according to the judgment result, comprises:
obtaining a pixel point i_n in the first image;
obtaining the N pixel points surrounding pixel point i_n according to the pixel position data;
comparing the pixel value of pixel point i_n with the pixel values of its N surrounding pixel points respectively; and
when the pixel value of pixel point i_n is different from the pixel values of its N surrounding pixel points, setting pixel point i_n to a predetermined value.
6. The method of claim 5, further comprising:
and sequentially reading the pixel points in the first image, and comparing the pixel values until all the pixel points in the first image are traversed.
7. The method of claim 6, further comprising:
and generating the second image according to the pixel value comparison result of each pixel point.
8. An apparatus for generating a trimap image, comprising:
the image processing device comprises a first module, a second module and a third module, wherein the first module is used for carrying out image processing on a preset image to generate a first image, and the first image is a binary image;
the position module is used for determining a preset parameter value M, wherein M is a positive integer; generating preset pixel position data around each pixel point in the first image according to the preset parameter M; marking the pixel position data by a preset numerical value;
the judging module is used for performing pixel value judgment on all the pixel points in the first image and their surrounding pixel points through the pixel position data and generating a second image according to the judgment result; and
the trimap image module is used for generating a trimap image of a predetermined image through the first image and the second image;
wherein, around each pixel point in the first image, generating predetermined pixel position data by the preset parameter includes:
taking a target pixel point in the first image as a central point, taking (2M +1) pixel units as side lengths to form a square, wherein all pixel points except the target pixel point in the square are preset pixel position data of the target pixel point;
wherein the generating a trimap image of a predetermined image through the first image and the second image comprises: generating a third image with the same size as the first image, wherein all pixel values of the third image are 255; generating a fourth image with the same size as the first image, wherein all pixel values of the fourth image are 128; and generating a trimap image of the predetermined image from the second image, the third image and the fourth image;
wherein the generating a trimap image of the predetermined image from the second image, the third image and the fourth image comprises:
performing image superposition on the second image, the third image and the fourth image through a trimap image calculation formula to generate a trimap image of the predetermined image;
wherein the trimap calculation formula is:
I_trimap = I_binary - I_temp1 + I_temp2
wherein I_trimap is the trimap, I_binary is the second image, I_temp1 is the third image, and I_temp2 is the fourth image.
9. The apparatus of claim 8, further comprising:
and the matting module is used for performing matting processing on the predetermined image through the trimap image.
10. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-7.
11. A computer-readable medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1-7.
CN201710994495.8A 2017-10-23 2017-10-23 Method and device for generating a trimap Active CN109697722B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710994495.8A CN109697722B (en) 2017-10-23 2017-10-23 Method and device for generating a trimap

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710994495.8A CN109697722B (en) 2017-10-23 2017-10-23 Method and device for generating a trimap

Publications (2)

Publication Number Publication Date
CN109697722A CN109697722A (en) 2019-04-30
CN109697722B true CN109697722B (en) 2021-10-15

Family

ID=66225853

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710994495.8A Active CN109697722B (en) 2017-10-23 2017-10-23 Method and device for generating a trimap

Country Status (1)

Country Link
CN (1) CN109697722B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110400323B (en) * 2019-07-30 2020-11-24 上海艾麒信息科技股份有限公司 Automatic cutout system, method and device
CN112070708B (en) * 2020-08-21 2024-03-08 杭州睿琪软件有限公司 Image processing method, image processing apparatus, electronic device, and storage medium
CN112634314A (en) * 2021-01-19 2021-04-09 深圳市英威诺科技有限公司 Target image acquisition method and device, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101673400A (en) * 2008-09-08 2010-03-17 索尼株式会社 Image processing apparatus, method, and program
CN102999897A (en) * 2011-09-19 2013-03-27 香港中文大学 Method and device for sea surface oil spillage detection based on SAR (synthetic aperture radar) image
CN104036517A (en) * 2014-07-01 2014-09-10 成都品果科技有限公司 Image matting method based on gradient sampling
CN107730528A (en) * 2017-10-28 2018-02-23 天津大学 A kind of interactive image segmentation and fusion method based on grabcut algorithms

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011034178A (en) * 2009-07-30 2011-02-17 Sony Corp Apparatus and method for image processing, and program
JP6355346B2 (en) * 2014-01-29 2018-07-11 キヤノン株式会社 Image processing apparatus, image processing method, program, and storage medium
KR101624801B1 (en) * 2014-10-15 2016-05-26 포항공과대학교 산학협력단 Matting method for extracting object of foreground and apparatus for performing the matting method
CN107123088B (en) * 2017-04-21 2019-09-13 山东大学 A kind of method of automatic replacement photo background color

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101673400A (en) * 2008-09-08 2010-03-17 索尼株式会社 Image processing apparatus, method, and program
CN102999897A (en) * 2011-09-19 2013-03-27 香港中文大学 Method and device for sea surface oil spillage detection based on SAR (synthetic aperture radar) image
CN104036517A (en) * 2014-07-01 2014-09-10 成都品果科技有限公司 Image matting method based on gradient sampling
CN107730528A (en) * 2017-10-28 2018-02-23 天津大学 A kind of interactive image segmentation and fusion method based on grabcut algorithms

Also Published As

Publication number Publication date
CN109697722A (en) 2019-04-30

Similar Documents

Publication Publication Date Title
CN108830780B (en) Image processing method and device, electronic device and storage medium
CN109858333B (en) Image processing method, image processing device, electronic equipment and computer readable medium
CN109344762B (en) Image processing method and device
US10152213B2 (en) Techniques for selecting objects in images
CN110717919A (en) Image processing method, device, medium and computing equipment
CN109118456B (en) Image processing method and device
CN109377508B (en) Image processing method and device
CN109711508B (en) Image processing method and device
CN109697722B (en) Method and device for generating trisection graph
CN108182457B (en) Method and apparatus for generating information
CN112749695A (en) Text recognition method and device
CN110633717A (en) Training method and device for target detection model
CN114792355A (en) Virtual image generation method and device, electronic equipment and storage medium
US20210319264A1 (en) Resolving training dataset category ambiguity
CN112508005B (en) Method, apparatus, device and storage medium for processing image
CN113837194A (en) Image processing method, image processing apparatus, electronic device, and storage medium
CN112651399A (en) Method for detecting same-line characters in oblique image and related equipment thereof
CN115082598B (en) Text image generation, training, text image processing method and electronic equipment
CN110956131A (en) Single-target tracking method, device and system
CN114926322B (en) Image generation method, device, electronic equipment and storage medium
CN114511862B (en) Form identification method and device and electronic equipment
CN113947146A (en) Sample data generation method, model training method, image detection method and device
WO2022105120A1 (en) Text detection method and apparatus from image, computer device and storage medium
CN115082298A (en) Image generation method, image generation device, electronic device, and storage medium
CN114882283A (en) Sample image generation method, deep learning model training method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant