WO2012114386A1 - Image vectorization device, image vectorization method, and image vectorization program - Google Patents

Image vectorization device, image vectorization method, and image vectorization program

Info

Publication number
WO2012114386A1
Authority
WO
WIPO (PCT)
Prior art keywords
area
edge
partial
image
importance
Prior art date
Application number
PCT/JP2011/001106
Other languages
English (en)
Japanese (ja)
Inventor
進也 田口 (Shinya Taguchi)
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation (三菱電機株式会社)
Priority to PCT/JP2011/001106 priority Critical patent/WO2012114386A1/fr
Publication of WO2012114386A1 publication Critical patent/WO2012114386A1/fr

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/41Bandwidth or redundancy reduction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/12Edge-based segmentation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/186Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image

Definitions

  • the present invention relates to, for example, an image vectorization apparatus, an image vectorization method, and an image vectorization program for converting a raster image into a vector image.
  • Non-Patent Document 1 below proposes an image vectorization method intended to facilitate image editing and to prevent blurring and collapse of shapes even when the image is enlarged or reduced. Specifically, it discloses an image vectorization device that realizes image vectorization by dividing a raster image into mesh-like partial areas and approximating the pixel values in each partial area with a bicubic surface.
  • A raster image is an image represented by a set of colored points.
  • A vector image is an image represented by points, lines, and surfaces, with its color information expressed by parametric equations.
  • Image vectorization is the conversion of a raster image into a vector image.
  • A point is a two-dimensional, three-dimensional, or N-dimensional coordinate value.
  • A line is a straight line or curve connecting two points.
  • A surface is an area enclosed by a plurality of lines.
  • The present invention has been made to solve the above-described problems, and an object of the invention is to obtain an image vectorization apparatus, an image vectorization method, and an image vectorization program that can dynamically change the amount of image data used in accordance with changes of scale or viewpoint, and that allow random access to the image data.
  • An image vectorization apparatus according to the invention comprises: edge detection means for detecting the edges present in a raster image and determining the importance of each edge in the raster image; area dividing means for dividing the raster image into a plurality of partial areas using the edge with the highest importance, as determined by the edge detection means, among the detected edges, then repeatedly subdividing those partial areas using edges of successively lower importance, thereby hierarchizing the divided partial areas; and color approximating means for approximating, for each partial area hierarchized by the area dividing means, the pixel values indicating the colors of the pixels constituting that area with a continuous function.
  • According to the invention, the edges present in the raster image are detected and their importance in the raster image is determined; the raster image is divided into a plurality of partial areas using the edge with the highest importance, and those partial areas are repeatedly subdivided using edges of lower importance, hierarchizing the divided areas; and, for each hierarchized partial area, the pixel values indicating the colors of its pixels are approximated with a continuous function. As a result, the amount of image data used can be changed dynamically according to the scale, viewpoint switching, and so on, and the image data can be accessed randomly.
  • FIG. 1 is a block diagram showing an image vectorization apparatus according to Embodiment 1 of the present invention.
  • The edge detection unit 1 is composed of, for example, a semiconductor integrated circuit mounting a CPU, MPU, or GPU (Graphics Processing Unit), or a one-chip microcomputer. It detects the edges present in a raster image, determines the importance of each edge in the raster image, and outputs edge information indicating the edges and importance information indicating their importance to the region dividing unit 2.
  • The edge detection unit 1 constitutes the edge detection means.
  • the area dividing unit 2 is composed of, for example, a semiconductor integrated circuit on which a CPU, MPU, or GPU is mounted, or a one-chip microcomputer.
  • Using the edge with the highest importance indicated by the importance information output from the edge detection unit 1, the area dividing unit 2 divides the raster image into multiple partial areas, and then, by repeating the process of dividing those partial areas into further partial areas using edges of successively lower importance, it hierarchizes the divided partial areas.
  • the area dividing unit 2 constitutes an area dividing means.
  • The color approximating unit 3 is composed of, for example, a semiconductor integrated circuit on which a CPU, MPU, or GPU is mounted, or a one-chip microcomputer. For each partial region hierarchized by the region dividing unit 2, it approximates the pixel values indicating the colors of the pixels constituting the partial region with a continuous function.
  • the color approximating unit 3 constitutes color approximating means.
  • the vector image data storage unit 4 is composed of a storage device such as a RAM or a hard disk, for example.
  • The vector image data storage unit 4 stores the vector image data generated by the image vectorization device (for example, region information indicating the partial areas hierarchized by the region dividing unit 2, and an area access table indicating the correspondence between the image coordinates of the raster image and the hierarchized partial areas).
  • FIG. 2 is a flowchart showing the processing contents (image vectorization method) of the image vectorization apparatus according to Embodiment 1 of the present invention.
  • FIG. 3 is an explanatory diagram showing edge detection processing and importance determination processing by the edge detection unit 1.
  • the edge detection processing and importance determination processing by the edge detection unit 1 will be described in detail.
  • When a raster image is input, the edge detection unit 1 first reduces it to a plurality of scales, as shown in FIG. 3A, to generate a plurality of reduced images before detecting the edges present in the raster image. In the example of FIG. 3A, reduced images of 1/2 and 1/4 the size of the raster image are generated. Having generated the reduced images, the edge detection unit 1 detects the edges present in the images at each scale (including the original raster image).
  • an edge existing in a full-size raster image, an edge existing in a reduced image of 1/2 size, and an edge existing in a reduced image of 1/4 size are detected.
  • An existing method such as the Canny method can be used for the edge detection.
  • When the edges are detected, the edge detection unit 1 determines the importance of each edge. For example, an edge that roughly captures the outline of an object appearing in the image is determined to be an edge of high importance, while an edge that captures details of the object outline is determined to be an edge of low importance.
  • the edge present in the reduced image of 1/4 size which is the smallest scale image is determined to be the edge having the highest importance.
  • importance 3 (the highest importance) is assigned to the edge.
  • Among the edges present in the 1/2-size reduced image, those not present in the 1/4-size reduced image (i.e., edges not assigned importance 3) are determined to be edges of intermediate importance, and importance 2 (medium importance) is assigned to them.
  • Edges not present in either the 1/2- or 1/4-size reduced images are determined to be the edges of lowest importance, and importance 1 (the lowest importance) is assigned to them.
  • the highest importance level 3 is given to the edges detected from all the scale images, and the lowest importance level 1 is given to the edges detected only from the original size raster image.
  • After performing the edge detection process and the importance determination process, the edge detection unit 1 outputs edge information indicating the edges present in each scale image and importance information indicating the importance assigned to each edge to the area dividing unit 2.
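The multi-scale edge detection and importance assignment described above can be sketched as follows. This is a minimal illustration in Python/NumPy, using a simple gradient-magnitude detector as a stand-in for a real detector such as Canny; the function names, the threshold, and the nearest-neighbor downsampling are assumptions for illustration, not part of the patent.

```python
import numpy as np

def detect_edges(img, thresh=0.3):
    """Simple gradient-magnitude edge detector (stand-in for e.g. Canny)."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    return mag > thresh * (mag.max() or 1.0)

def edge_importance(img, levels=3):
    """Assign importance 1..levels to edge pixels of a grayscale image.

    Edges that survive at the coarsest scale get the highest importance;
    edges seen only at full size keep importance 1.
    """
    h, w = img.shape
    importance = np.zeros((h, w), dtype=int)
    for lvl in range(1, levels + 1):          # lvl == levels is coarsest
        factor = 2 ** (lvl - 1)               # scales 1, 1/2, 1/4, ...
        small = img[::factor, ::factor]       # naive downsampling
        edges = detect_edges(small)
        # upsample the edge mask back to full resolution
        up = np.kron(edges, np.ones((factor, factor), dtype=bool))[:h, :w]
        importance[up] = np.maximum(importance[up], lvl)
    return importance

# toy example: a vertical step edge survives at every scale
img = np.zeros((16, 16)); img[:, 8:] = 1.0
imp = edge_importance(img, levels=3)   # step pixels receive importance 3
```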
  • Upon receiving the edge information and importance information from the edge detection unit 1, the area dividing unit 2 refers to them, divides the raster image into a plurality of partial areas (step ST3), and then repeatedly divides those partial areas into further partial areas (steps ST4 and ST5). That is, the region dividing unit 2 refers to the importance information output from the edge detection unit 1, identifies the edge of importance 3 among the detected edges, and divides the raster image into a plurality of partial regions using that edge (step ST3).
  • Next, the edge of importance 2, one level lower than importance 3, is identified, and the partial areas obtained with the importance 3 edge are further divided using it (steps ST4 and ST5). Then the edge of importance 1, one level lower than importance 2, is identified, and the partial areas obtained with the importance 3 and 2 edges are further divided using it. Since no edge has a lower importance than importance 1, the repetition of steps ST4 and ST5 then ends.
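The iterative division of steps ST3 to ST5, splitting with the most important edges first and re-splitting each resulting area with successively less important edges, can be sketched generically. The `split` callback and the one-dimensional toy below are illustrative assumptions; the patent's actual splitter produces rectangular partial areas bounded by area dividing lines.

```python
def hierarchize(split, image, edges_by_importance):
    """Repeatedly divide regions with edges of decreasing importance.

    `split(region, edges)` is assumed to return the sub-regions of
    `region` cut by `edges` (or [region] if no edge crosses it).
    Returns one list of regions per hierarchy level.
    """
    levels = []
    regions = [image]
    for edges in edges_by_importance:            # highest importance first
        regions = [sub for r in regions for sub in split(r, edges)]
        levels.append(list(regions))
    return levels

# toy 1-D splitter: a "region" is an interval, an "edge" a cut point
def split(region, cuts):
    a, b = region
    pts = sorted(p for p in cuts if a < p < b)
    bounds = [a] + pts + [b]
    return list(zip(bounds, bounds[1:]))

levels = hierarchize(split, (0, 10), [[5], [2, 7]])
# levels[0] -> [(0, 5), (5, 10)]
# levels[1] -> [(0, 2), (2, 5), (5, 7), (7, 10)]
```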
  • FIG. 4 is an explanatory diagram showing the area dividing process and the partial area hierarchizing process by the area dividing unit 2.
  • For example, the region dividing unit 2 divides the raster image into the partial regions R1, R2, R3, and R4 (see FIG. 4B) using the “importance 3 edge” having the highest importance (see FIG. 4A).
  • FIG. 5 is an explanatory diagram showing a region dividing method of the region dividing unit 2.
  • First, the area dividing unit 2 obtains the inflection points, branch points (including intersection points), and end points of the edges present in the image. In the example of FIG. 5, the end point of edge (1), the intersection of edge (1) and edge (2), and the inflection point of edge (2) are obtained.
  • the area dividing unit 2 obtains area dividing lines passing through horizontal or vertical edges existing in the image, and divides the image by the area dividing lines.
  • In the example of FIG. 5, since edge (1) is a horizontal edge, the area dividing line passing through edge (1) is obtained; no vertical edge exists in the image.
  • the area dividing unit 2 obtains a horizontal or vertical area dividing line passing through the end points of the edges, and divides the image by the area dividing line.
  • In the example of FIG. 5, a vertical area dividing line passing through the end point of edge (1) is obtained; the horizontal area dividing line passing through that end point overlaps with an area dividing line already obtained.
  • the area dividing unit 2 obtains a horizontal or vertical area dividing line passing through the intersection of the edges, and divides the image by the area dividing line.
  • In the example of FIG. 5, a vertical area dividing line passing through the intersection of edge (1) and edge (2) is obtained; the horizontal area dividing line passing through that intersection overlaps with an area dividing line already obtained.
  • the region dividing unit 2 obtains a horizontal or vertical region dividing line passing through the inflection point of the edge, and divides the image by the region dividing line.
  • In the example of FIG. 5, the horizontal and vertical area dividing lines passing through the inflection point of edge (2) are obtained.
  • The area dividing unit 2 repeats the operations shown in FIGS. 5B to 5E until each rectangular partial area of the divided image contains at most one curved edge. When a new area dividing line is added in these operations, it may or may not cross the existing area dividing lines.
  • When the division is complete, the region dividing unit 2 obtains an approximate curve by approximating each oblique line or curve included in a partial region with a quadratic or cubic curve.
  • Internal line information indicating the type of the function describing the approximate curve and the parameters of that function is set as part of the area information (attribute information) of the partial area.
  • Here, the types of oblique lines and curves included in a rectangular partial region are not limited, but region division may also be performed with the curve types limited to certain patterns.
  • FIG. 6 is an explanatory diagram showing another area dividing method of the area dividing unit 2.
  • As described above, the image may be divided into a plurality of partial areas using the inflection points, branch points (including intersections), and end points of the edges present in the image; alternatively, the image may be divided into a plurality of partial regions using any one of the region patterns (1), (2), and (3) shown in FIG. 6A.
  • the area pattern (1) is an area not including a dividing line
  • the area pattern (2) is an area including a dividing line described by a curve or a straight line passing through the upper right corner and the lower left corner
  • the area pattern (3) is an area including a dividing line described by a curve or a straight line passing through the upper left corner and the lower right corner.
  • When two edges (1) and (2) exist in the image as shown in FIG. 6B, the region dividing unit 2 selects, for each small region serving as a partial region, whichever of the region patterns (1), (2), and (3) matches the edges: for example, region pattern (1) is selected for small regions containing neither edge (1) nor edge (2), and region pattern (2) or (3) is selected for small regions containing an edge, according to the direction of its curve or straight line. By combining the region patterns selected for the small regions, a region division result as shown in FIG. 6C is obtained.
  • With this method, the types of curves that can appear in each partial region are limited, which has the advantage of reducing the number of curve parameters.
  • After dividing the raster image into the partial areas R1, R2, R3, and R4 using the “importance 3 edge” having the highest importance (see FIG. 4A), the area dividing unit 2 further divides those partial areas using the “importance 2 edge” having the next highest importance (see FIG. 4B). In the example of FIG. 4, since an importance 2 edge exists in the partial region R4, the partial region R4 is divided into partial regions R5, R6, and R7.
  • After dividing the partial region R4 into the partial regions R5, R6, and R7 using the “importance 2 edge” (see FIG. 4A), the region dividing unit 2 further divides the partial areas using the “importance 1 edge” having the lowest importance (see FIG. 4B). In the example of FIG. 4, the importance 1 edge crosses the partial regions R1, R2, and R3, so the partial region R1 is divided into the partial regions R8 and R11, the partial region R2 into the partial regions R9 and R12, and the partial region R3 into the partial regions R10 and R13.
  • the area dividing unit 2 performs a hierarchizing process on the divided partial areas R1 to R13 (step ST6).
  • The partial areas R1, R2, R3, and R4, divided using only the “importance 3 edge” having the highest importance, are defined as belonging to hierarchy (1) (the highest hierarchy), and the hierarchy number indicating hierarchy (1) is assigned to them.
  • The partial regions R5, R6, and R7, obtained by additionally using the “importance 2 edge”, are defined as belonging to hierarchy (2) (the middle hierarchy), and the hierarchy number indicating hierarchy (2) is assigned to them.
  • The partial areas R8, R9, R10, R11, R12, and R13, obtained by additionally using the “importance 1 edge”, are defined as belonging to hierarchy (3) (the lowest hierarchy), and the hierarchy number indicating hierarchy (3) is assigned to them.
  • After assigning the hierarchy numbers to the partial regions R1 to R13, the region dividing unit 2 links the corresponding partial regions across the hierarchies (1) to (3).
  • That is, since the partial regions R5, R6, and R7 were divided from the partial region R4, there is a correspondence between them, and the partial region R4 is linked with the partial regions R5, R6, and R7, as shown in FIG. 4. Likewise, the partial region R1 is linked with the partial regions R8 and R11, the partial region R2 with the partial regions R9 and R12, and the partial region R3 with the partial regions R10 and R13.
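The hierarchy numbers and links described above amount to a tree over the partial regions. A minimal sketch of that structure follows; the `Region` class and its field names are illustrative assumptions mirroring FIG. 4, not a data layout from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Region:
    number: int                 # area number, e.g. 1 for R1
    level: int                  # hierarchy number: 1 = highest hierarchy
    children: list = field(default_factory=list)  # linked lower-level regions

    def link(self, child):
        """Record that `child` was divided from this region."""
        self.children.append(child)
        return child

# Hierarchy of FIG. 4: R1-R4 in hierarchy (1); R4 -> R5, R6, R7 in (2);
# R1 -> R8, R11; R2 -> R9, R12; R3 -> R10, R13 in hierarchy (3).
regions = {n: Region(n, 1) for n in (1, 2, 3, 4)}
for n in (5, 6, 7):
    regions[n] = regions[4].link(Region(n, 2))
for parent, kids in {1: (8, 11), 2: (9, 12), 3: (10, 13)}.items():
    for n in kids:
        regions[n] = regions[parent].link(Region(n, 3))
```

Walking such links from the top hierarchy downward is what lets a viewer fetch only as many levels of detail as the current scale requires.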
  • Next, for each partial area hierarchized by the area dividing unit 2, the color approximating unit 3 approximates the pixel values indicating the colors of the pixels constituting the partial area with a continuous function.
  • FIG. 7 is an explanatory diagram showing attribute setting processing such as a side surrounding a partial region and sampling processing of pixel values by the color approximation unit 3.
  • FIG. 8 is an explanatory diagram showing a local coordinate system (U, V) of sampling points.
  • The color approximating unit 3 sets the attribute of each side surrounding a partial region to “continuous side” or “discontinuous side” (step ST7). That is, the color approximating unit 3 determines whether an edge exists on a side surrounding the partial region, sets the attribute of a side on which an edge exists to “discontinuous side”, and sets the attribute of a side on which no edge exists to “continuous side”. For example, focusing on the partial region R1 shown in FIG. 4, since an edge exists at the boundary between the partial region R1 and the partial region R2 (see FIGS. 4A and 4B), the attribute of the right side of the partial region R1 is set to “discontinuous side”.
  • Here, the attribute of each side surrounding the partial area is set according to the presence or absence of an edge, but it may instead be set based on the continuity of the colors near the side. That is, the color approximating unit 3 calculates the difference between the pixel values inside the partial area (the pixel values indicating the colors of the pixels constituting it) and the pixel values in the adjacent areas; if the difference is equal to or greater than a predetermined value, the attribute of the side is set to “discontinuous side”, and if it is less than the predetermined value, the attribute is set to “continuous side”. As a method for determining the continuity of pixel values, for example, the Euclidean distance between the sets of pixel values on either side of adjacent sides may be calculated.
  • The attribute of the upper and right sides of the partial region Z1 is set to “discontinuous side”, and the attribute of its lower and left sides is set to “continuous side”.
  • the color approximating unit 3 always sets the attribute of the diagonal line or curve included in each partial region to “discontinuous side”.
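The color-continuity variant of the attribute setting, mentioned above as an alternative to the edge-based test, can be sketched as follows. The threshold value, function name, and array layout are assumptions for illustration.

```python
import numpy as np

def side_attribute(inside, outside, thresh=30.0):
    """Classify a side as continuous or discontinuous.

    `inside` / `outside`: N x 3 arrays of RGB pixel values sampled just
    inside and just outside the side.  The mean Euclidean distance of
    the pairs decides the attribute; `thresh` is an assumed tuning value.
    """
    inside = np.asarray(inside, dtype=float)
    outside = np.asarray(outside, dtype=float)
    dist = np.linalg.norm(inside - outside, axis=1)
    return "discontinuous" if dist.mean() >= thresh else "continuous"

# pixels nearly equal across the side -> "continuous"
a = side_attribute([[10, 10, 10]] * 4, [[12, 11, 10]] * 4)
# a sharp colour jump across the side -> "discontinuous"
b = side_attribute([[10, 10, 10]] * 4, [[200, 10, 10]] * 4)
```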
  • When the attributes of the sides surrounding each partial region have been set, the color approximating unit 3 determines the pixel values to be sampled according to the attribute settings (step ST8).
  • the attributes of the upper and right sides of the partial area Z1 are “discontinuous sides”, and the attributes of the lower and left sides are “continuous sides”.
  • The partial area Z1 includes a diagonal line L, and the attribute of the diagonal line L is “discontinuous side”; the diagonal line L therefore divides the partial region Z1 into an upper part and a lower part.
  • Since the upper part of the partial area Z1 is surrounded only by lines with the “discontinuous side” attribute, when the pixel values indicating the colors of the pixels constituting the upper part are approximated with a continuous function, it is desirable not to mix in colors from the lower part of Z1 or from the areas above and to the right of Z1. For this reason, only the pixel values of the pixels constituting the upper part are determined as sampling targets, namely the pixel values at the positions marked for the upper part in FIG. 7B.
  • Since the attribute of the lower and left sides of the lower part of the partial area Z1 is “continuous side”, when the pixel values of the pixels constituting the lower part are approximated with a continuous function, it is desirable that the colors change smoothly into the areas below and to the left of Z1. For this reason, the pixel values of the pixels constituting the lower part, together with some pixel values in the areas below and to the left of Z1, are determined as sampling targets, namely the pixel values at the positions marked for the lower part in FIG. 7B.
  • Having determined the sampling target pixel values for each partial region hierarchized by the region dividing unit 2, the color approximation unit 3 samples those pixel values and approximates them with a continuous function (step ST9).
  • As shown in FIG. 8, a local coordinate system (U, V) of the surface, with point 1 as the origin (0, 0), is defined within each region, where u and v are real parameters between “0” and “1”.
  • the color and luminance information of the point (u, v) sampled by the color approximating unit 3 can be approximated by an arbitrary parametric function F (u, v).
  • a Bezier curved surface or “Ferguson patch” described in Non-Patent Document 1 can be used.
  • a sigmoid function F (u, v) that can express a steep edge can be used.
  • F(u, v) = 1 / (1 + exp(a·u + b·v + c))
  • a, b, and c are constants and may be arbitrarily specified by the user.
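As a concrete illustration of the sigmoid surface above (the constant values chosen here are arbitrary, as the text permits):

```python
import math

def sigmoid_patch(a, b, c):
    """F(u, v) = 1 / (1 + exp(a*u + b*v + c)), a colour surface that can
    express a steep edge inside the unit square 0 <= u, v <= 1."""
    def F(u, v):
        return 1.0 / (1.0 + math.exp(a * u + b * v + c))
    return F

# a = -10, b = 0, c = 5 puts a steep transition along u = 0.5:
F = sigmoid_patch(-10.0, 0.0, 5.0)
# F(0, 0) is near 0, F(1, 0) is near 1, and F(0.5, 0) is exactly 0.5
```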
  • FIG. 9 is an explanatory diagram showing a configuration example of vector image data.
  • the vector image data includes area information and an area access table.
  • The area information is generated for each partial area belonging to each hierarchy, and an area access table is generated for each hierarchy; for example, when there are M hierarchies (1) to (M), M area access tables are generated.
  • For example, as shown in FIG. 4, when there are 4 partial areas belonging to hierarchy (1), 3 belonging to hierarchy (2), and 6 belonging to hierarchy (3), four pieces of area information are generated for hierarchy (1), three for hierarchy (2), and six for hierarchy (3).
  • the area information is information composed of boundary line information, boundary line attribute information, color information, link area information, and internal line information.
  • The boundary line information is coordinate information generated by the area dividing unit 2; it indicates the positions of the boundary lines of the partial area (the sides surrounding the partial area and any oblique lines or curves included in it).
  • the boundary line attribute information is attribute information generated by the region dividing unit 2, and the boundary line attribute information is information indicating the attribute of the boundary line. For example, it indicates whether a side, a curve, or the like surrounding the partial region is a “continuous side” or a “discontinuous side”.
  • the color information is information relating to the color generated by the color approximating unit 3, and this color information is information indicating the type of continuous function approximating the pixel value of the partial area and the parameters of the continuous function.
  • The link area information is information generated by the area dividing unit 2 that indicates the hierarchical relationships of the partial areas: the area numbers identifying the upper- and lower-hierarchy partial areas linked to a given partial area, the hierarchy number of the hierarchy to which each partial area belongs, and so on.
  • The internal line information is information generated by the area dividing unit 2 that indicates, for the oblique lines or curves included in the partial area, the type of function describing the approximate curve and the parameters of that function.
  • the area access table is a look-up table generated by the area dividing unit 2, and this area access table shows the correspondence between the image coordinates of the raster image and the partial areas hierarchized.
  • FIG. 10 is an explanatory diagram of a configuration example of the area access table.
  • Next, a method will be described by which an external device (for example, a car navigation device) randomly accesses the vector image data stored in the vector image data storage unit 4 by referring to the area access table, and acquires pixel values.
  • Assume that a 9×9-pixel raster image is divided into partial regions R1, R2, R3, and R4 as shown in FIG. 10. The area access table is a table of the minimum size that maintains the ratio of the partial areas: here it is 3×3 pixels (the original raster image reduced to 1/3), and the area number (1 to 4) of the partial region covering each cell is substituted into the table.
  • To acquire the pixel value at image coordinates (6, 5), for example, the external device scales the coordinates by 1/3, the ratio of the area access table to the raster image, and rounds them to obtain (2, 2). Using (2, 2) as the coordinate value of the area access table shown in FIG. 10, it acquires the area number “4” assigned to those coordinates. It then reads out the area information of the corresponding partial area (for example, its color information and internal line information), and determines and draws the color of each pixel constituting that partial area (the part of the raster image that includes image coordinates (6, 5)) according to the area information.
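The table lookup just described can be sketched as follows. The table contents below are an assumed layout for the 9×9 example (the real assignment of area numbers 1 to 4 follows FIG. 10), and the clamping to table bounds is an added safeguard.

```python
def lookup_region(table, x, y, ratio):
    """Map raster-image coordinates to an area number via the access table.

    Coordinates are scaled by 1/ratio and rounded, then clamped to the
    table bounds, as in the (6, 5) -> (2, 2) example in the text.
    """
    tx = min(round(x / ratio), len(table[0]) - 1)
    ty = min(round(y / ratio), len(table) - 1)
    return table[ty][tx]

# assumed 3x3 access table for the 9x9 raster image (area numbers 1-4)
table = [
    [1, 2, 2],
    [1, 4, 4],
    [3, 4, 4],
]
region = lookup_region(table, x=6, y=5, ratio=3)  # -> area number 4
```

The reduced table keeps the lookup O(1) per pixel regardless of how finely the image is hierarchized, which is what makes the random access cheap on a memory-constrained device.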
  • As described above, the edge detection unit 1 detects the edges existing in a raster image and determines the importance of each edge in the raster image; the area dividing unit 2 divides the raster image into a plurality of partial areas using the edge with the highest importance determined by the edge detection unit 1 among the detected edges, repeats the division using edges of successively lower importance, and thereby hierarchizes the divided partial areas; and the color approximating unit 3 approximates, for each partial area hierarchized by the area dividing unit 2, the pixel values indicating the colors of the pixels constituting the partial area by a continuous function. As a result, the amount of vector image data actually used can be changed dynamically, and the image data can be randomly accessed.
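The hierarchization by edge importance can be illustrated with a one-dimensional toy model (a sketch under simplifying assumptions: edges are reduced to split positions on a pixel row, and `build_hierarchy` is an illustrative name, not from this publication):

```python
def build_hierarchy(start, end, edges):
    """Split the interval [start, end) at the position of the most
    important edge inside it, then recurse into both halves with the
    remaining, less important edges, yielding a hierarchy of partial
    areas as nested (start, end, children) tuples."""
    inside = [e for e in edges if start < e[1] < end]
    if not inside:
        return (start, end, [])
    best = max(inside)                 # (importance, position): highest importance wins
    rest = [e for e in inside if e != best]
    _, pos = best
    left = build_hierarchy(start, pos, rest)
    right = build_hierarchy(pos, end, rest)
    return (start, end, [left, right])

# Three edges on a 9-pixel row: the most important (0.9) splits first.
edges = [(0.9, 4), (0.5, 2), (0.3, 7)]
tree = build_hierarchy(0, 9, edges)
```

The top level splits at position 4; the lower-importance edges at 2 and 7 then subdivide the two halves one hierarchy level down, mirroring how the area dividing unit 2 uses successively less important edges to hierarchize the partial areas.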
  • For example, an external device (for example, a car navigation device) can dynamically change the area information necessary for drawing a map at a certain scale or viewpoint.
  • any component of the embodiment can be modified or any component of the embodiment can be omitted within the scope of the invention.
  • the present invention is suitable for an image vectorization apparatus that needs to be able to draw a desired image even in a car navigation apparatus that has a small amount of available memory.
  • 1 edge detection unit (edge detection means)
  • 2 region division unit (region division means)
  • 3 color approximation unit (color approximation means)
  • 4 vector image data storage unit (vector image data storage means)

Abstract

An image vectorization device comprises: an edge detection unit (1) for detecting edges present in a raster image and determining the importance of the edges in the raster image; and an area dividing unit (2) for dividing the raster image into a plurality of partial areas using the edges with the highest importance determined by the edge detection unit (1) among the edges detected by the edge detection unit (1), repeating a process of dividing the partial areas into a plurality of further partial areas using edges of successively lower importance, and thereby hierarchizing the divided partial areas. A color approximating unit (3) approximates, for each of the partial areas hierarchized by the area dividing unit (2), the pixel values indicating the colors of the pixels constituting the partial areas using continuous functions.
PCT/JP2011/001106 2011-02-25 2011-02-25 Image vectorization device, image vectorization method, and image vectorization program WO2012114386A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/001106 WO2012114386A1 (fr) 2011-02-25 2011-02-25 Image vectorization device, image vectorization method, and image vectorization program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/001106 WO2012114386A1 (fr) 2011-02-25 2011-02-25 Image vectorization device, image vectorization method, and image vectorization program

Publications (1)

Publication Number Publication Date
WO2012114386A1 true WO2012114386A1 (fr) 2012-08-30

Family

ID=46720209

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/001106 WO2012114386A1 (fr) 2011-02-25 2011-02-25 Image vectorization device, image vectorization method, and image vectorization program

Country Status (1)

Country Link
WO (1) WO2012114386A1 (fr)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07107294A (ja) * 1993-09-30 1995-04-21 Toshiba Corp Image encoding device
JPH0927966A (ja) * 1995-07-12 1997-01-28 Sanyo Electric Co Ltd Image encoding method and image encoding device
JPH09200750A (ja) * 1996-11-08 1997-07-31 Sony Corp Data transmission method
JP2004023370A (ja) * 2002-06-14 2004-01-22 Ikegami Tsushinki Co Ltd Image encoding method and device therefor
JP2008147880A (ja) * 2006-12-07 2008-06-26 Nippon Telegr & Teleph Corp <Ntt> Image compression device and method, and program therefor
JP2009111649A (ja) * 2007-10-29 2009-05-21 Sony Corp Information encoding device and method, information retrieval device and method, information retrieval system and method, and program


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TORU MIYAKOSHI ET AL.: "A Segmentation Method for Real Images using Quadratic Curved Line Units", IEICE TECHNICAL REPORT, vol. 103, no. 642, 26 January 2004 (2004-01-26), pages 13 - 18 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113112570A (zh) * 2021-05-12 2021-07-13 Beijing University of Posts and Telecommunications Perception-driven vectorization effect evaluation method
CN113112570B (zh) * 2021-05-12 2022-05-20 Beijing University of Posts and Telecommunications Perception-driven vectorization effect evaluation method

Similar Documents

Publication Publication Date Title
US11922534B2 (en) Tile based computer graphics
JP7004759B2 (ja) Varying effective resolution by screen location by changing active color sample count within multiple render targets
JP6678209B2 (ja) Gradient adjustment for texture mapping to a non-orthonormal grid
JP6563048B2 (ja) Gradient adjustment for texture mapping of multiple render targets with resolution that varies by screen location
TWI578266B Varying effective resolution by screen location in graphics processing by approximating projection of vertices onto a curved viewport
TWI584223B Method and system of graphics processing enhancement by tracking object and/or primitive identifiers, graphics processing unit, and non-transitory computer readable medium
US7884825B2 (en) Drawing method, image generating device, and electronic information apparatus
TW201539374A Method for efficient construction of high resolution display buffers
JP2005100177A (ja) Image processing apparatus and method therefor
EP4094231A1 (fr) Mesh optimization for computer graphics
US9721187B2 (en) System, method, and computer program product for a stereoscopic image lasso
RU2680355C1 (ru) Способ и система удаления невидимых поверхностей трёхмерной сцены
US20150015574A1 (en) System, method, and computer program product for optimizing a three-dimensional texture workflow
US10347034B2 (en) Out-of-core point rendering with dynamic shapes
WO2012114386A1 (fr) Image vectorization device, image vectorization method, and image vectorization program
US10062191B2 (en) System and method for rendering points without gaps
US11869123B2 (en) Anti-aliasing two-dimensional vector graphics using a compressed vertex buffer

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11859616

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11859616

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP