US6731801B2 - Image processing apparatus and method for recognizing specific pattern and recording medium having image processing program recorded thereon - Google Patents


Info

Publication number
US6731801B2
Authority
US
United States
Prior art keywords
pixel
image
connecting region
region
index
Prior art date
Legal status
Expired - Fee Related
Application number
US10/369,809
Other versions
US20030156757A1 (en)
Inventor
Akira Murakawa
Keisuke Hashimoto
Current Assignee
Minolta Co Ltd
Original Assignee
Minolta Co Ltd
Priority date
Filing date
Publication date
Priority claimed from JP11092094A (JP2000285240A)
Priority claimed from JP11092050A (JP2000285237A)
Application filed by Minolta Co Ltd
Priority to US10/369,809
Publication of US20030156757A1
Application granted
Publication of US6731801B2

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 — Arrangements for image or video recognition or understanding
    • G06V 10/40 — Extraction of image or video features
    • G06V 10/44 — Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V 10/443 — Local feature extraction by matching or filtering
    • G06V 10/457 — Local feature extraction by analysing connectivity, e.g. edge linking, connected component analysis or slices
    • G06V 10/70 — Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 — Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 — Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V 10/751 — Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06V 10/7515 — Shifting the patterns to accommodate for positional errors

Definitions

  • the present invention relates to an image processing technique for recognizing a specific pattern included in an image.
  • In a conventional image processing apparatus, the position of a specific pattern included in an image is detected and a pattern matching process is carried out to recognize the shape of the specific pattern by storing data on each pixel fetched by a camera or the like as binary data or multivalued data in an image memory and by matching, for each pixel, the stored data with data on each pixel of a reference pattern stored beforehand in the memory.
  • In this matching, the whole image is scanned while the reference pattern is shifted pixel by pixel in the x or y direction over the image including the specific pattern.
  • Examples of a method for detecting the position of the specific pattern of an image at a high speed include a method of creating a reduced image from the image, performing pattern matching on the reduced image to retrieve the specific pattern, and then carrying out retrieval again in the region of the original image corresponding to the position at which the specific pattern is detected on the reduced image.
  • However, the specific pattern cannot be detected accurately in some cases, depending on the algorithm used for creating the reduced image. For example, for an image whose pixel value alternates every pixel, such as a checkered pattern, when thinning is carried out every other pixel to create a reduced image, the reduced image thus created contains only one of the pixel values appearing in the original image. Consequently, the correct specific pattern cannot be detected from such a reduced image.
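The failure mode described above can be reproduced in a few lines. The following sketch (illustrative only; the image size and pixel values are arbitrary) thins a checkered image every other pixel and shows that only one of the two alternating pixel values survives:

```python
# Illustrative sketch: thinning a checkerboard image every other pixel
# discards one of the two alternating pixel values entirely.

def make_checkerboard(h, w):
    """Binary image whose pixel value alternates every pixel."""
    return [[(x + y) % 2 for x in range(w)] for y in range(h)]

def thin_every_other_pixel(img):
    """Reduce the image by keeping every second pixel in x and y."""
    return [row[::2] for row in img[::2]]

original = make_checkerboard(8, 8)
reduced = thin_every_other_pixel(original)

# Both values 0 and 1 appear in the original ...
assert {p for row in original for p in row} == {0, 1}
# ... but the reduced image contains only one of them, so the
# checkered pattern can no longer be detected in the reduced image.
assert {p for row in reduced for p in row} == {0}
```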
  • Examples of a process to be carried out prior to a process such as image recognition include a labeling process of detecting a region formed by continuous pixels having the same value in a binary image (hereinafter referred to as a “connecting region”) and giving a label number to the connecting region.
  • FIG. 21 is a diagram illustrating one example of the labeling processes in the connecting region of the image according to the prior art.
  • An original image 20 includes three connecting regions A, B and C. These connecting regions A, B and C are continuous regions comprising pixels having a pixel value of “1”.
  • First, the whole image is scanned to examine the pixel value of each pixel and to give a provisional label number to each pixel decided to constitute a connecting region.
  • The provisional label number is not always accurate. Therefore, re-labeling is carried out later in some cases. For this reason, information for the re-labeling is stored in tables 51 and 53. Then, the image is scanned again to replace the provisional label number with a real label number with reference to the tables 51 and 53.
  • When a pixel having a pixel value of “1” is detected during the scanning of the image 20 in a first scanning operation, it is decided whether or not any adjacent pixel has a label number given thereto. If there is an adjacent pixel having a label number given thereto, the same label number as in the adjacent pixel is given to the detected pixel. When a plurality of adjacent pixels have label numbers given thereto, the same label number as in the adjacent pixel having a higher priority is given based on a predetermined priority. When there is no adjacent pixel having a label number given thereto, a new label number is given to the detected pixel. Thus, the inside of the image is scanned to give a label number to each pixel having a pixel value of “1”.
  • The connecting region A of the image 20 is labeled as three regions having label numbers 2, 3 and 4 in an image 20 ′ as a result of the above-mentioned labeling.
  • a link table 51 is created.
  • The link table 51 stores information indicating that the three regions having the label numbers 2, 3 and 4 are identical.
  • Next, label editing is carried out to create a conversion table 53 that sets the label numbers 2, 3 and 4 to 2, and re-labels the regions having label numbers 5 and 6 so as to fill the vacancy generated by removing the label numbers 3 and 4.
  • Finally, each of the pixels of the image 20 ′ is scanned and the provisional label number is converted (re-labeled) into a real label number by referring to the information of the conversion table 53. Consequently, each region is finally labeled correctly as in an image 20 ″.
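The prior-art flow of FIG. 21 can be summarized in code. The sketch below is a generic two-pass labeling routine with a link (equivalence) table and a final conversion pass; the labels start at 1 and the priority rule (smaller label wins) is a simplification, not taken verbatim from the patent:

```python
# Hedged sketch of the prior-art two-pass labeling: a first scan assigns
# provisional labels and records equivalences in a link table; a second
# scan converts provisional labels into final, compacted label numbers.

def two_pass_label(img):
    h, w = len(img), len(img[0])
    labels = [[0] * w for _ in range(h)]
    parent = {}          # link table: provisional label -> equivalent label

    def find(a):         # follow the links to the representative label
        while parent[a] != a:
            a = parent[a]
        return a

    next_label = 1
    for y in range(h):
        for x in range(w):
            if img[y][x] != 1:
                continue
            neighbours = []
            if x > 0 and labels[y][x - 1]:
                neighbours.append(labels[y][x - 1])
            if y > 0 and labels[y - 1][x]:
                neighbours.append(labels[y - 1][x])
            if not neighbours:
                parent[next_label] = next_label   # brand-new label
                labels[y][x] = next_label
                next_label += 1
            else:
                m = min(neighbours)       # priority: smaller label wins
                labels[y][x] = m
                for n in neighbours:      # record equivalence in link table
                    parent[find(n)] = find(m)

    # conversion table: compact the surviving labels to 1, 2, 3, ...
    convert = {}
    for y in range(h):
        for x in range(w):
            if labels[y][x]:
                root = find(labels[y][x])
                convert.setdefault(root, len(convert) + 1)
                labels[y][x] = convert[root]
    return labels

# A U-shaped region first receives two provisional labels, which the
# link table merges; the second scan emits a single final label.
out = two_pass_label([[1, 0, 1],
                      [1, 0, 1],
                      [1, 1, 1]])
assert out[0][0] == out[0][2] == 1   # both arms end up with one label
```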
  • the present invention is directed to solve the above-mentioned problem, and has an object to provide an image processing apparatus and method capable of retrieving a specific pattern from an image at a high speed and a recording medium storing a program therefor.
  • The invention has another object to provide an image processing apparatus and method for performing a labeling process on a connecting region of an image at a high speed, and a recording medium storing a program therefor.
  • According to an aspect of the present invention, an image processing apparatus for retrieving a specific pattern of an image is provided.
  • The apparatus comprises a unit for dividing the image into a plurality of block regions, each of the block regions having a predetermined size, a unit for creating a map image comprising a plurality of pixels, each of the pixels corresponding one-to-one to one of the block regions, a unit for extracting features of the image for each of the block regions, a unit for deciding whether or not the extracted feature meets a predetermined condition, a unit for labeling the pixel on the map image based on a result of the decision, the pixel corresponding to the block region whose feature meets the condition, a unit for detecting a predetermined pattern for specifying a search region on the image from the labeled map image, a unit for specifying the search region based on a position on the map image in which the predetermined pattern is detected, and a unit for retrieving the specific pattern in the specified search region on the image.
  • In the apparatus, at least one of a concentration value of a pixel, an edge amount and a co-occurrence characteristic may be extracted as the feature amount.
  • the predetermined pattern for determining the search region may be detected by using a filter having the same size as a size of the specific pattern in the map image.
  • According to another aspect of the present invention, an image processing method for retrieving a specific pattern of an image is provided.
  • The method comprises dividing the image into a plurality of block regions, each of the block regions having a predetermined size, creating a map image comprising a plurality of pixels, each of the pixels corresponding one-to-one to one of the block regions, extracting features of the image for each of the block regions, deciding whether or not the extracted feature meets a predetermined condition, labeling the pixel on the map image based on a result of the decision, the pixel corresponding to the block region whose feature meets the condition, detecting a predetermined pattern for specifying a search region on the image from the labeled map image, specifying the search region based on a position on the map image in which the predetermined pattern is detected, and retrieving the specific pattern in the specified search region on the image.
  • According to still another aspect of the present invention, a computer readable recording medium storing a program for controlling the computer to retrieve a specific pattern from an image is provided.
  • The program is capable of executing the following functions: a function for dividing the image into a plurality of block regions, each of the block regions having a predetermined size; a function for creating a map image comprising a plurality of pixels, each of the pixels corresponding one-to-one to one of the block regions; a function for extracting features of the image for each of the block regions; a function for deciding whether or not the extracted feature meets a predetermined condition; a function for labeling the pixel on the map image based on a result of the decision, the pixel corresponding to the block region whose feature meets the condition; a function for detecting a predetermined pattern for specifying a search region on the image from the labeled map image; a function for specifying the search region based on a position on the map image in which the predetermined pattern is detected; and a function for retrieving the specific pattern in the specified search region on the image.
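The functions enumerated above can be tied together roughly as follows. This is a minimal illustrative sketch, not the patented implementation: the block size of 8 × 8 pixels follows the embodiment, while the feature function and the condition are placeholders supplied by the caller.

```python
# Minimal sketch of the map-image pipeline: divide the image into blocks,
# and label one map-image pixel per block whose feature meets the condition.

BLOCK = 8  # block size of 8x8 pixels, as in the embodiment

def build_map_image(img, feature, condition):
    """img: 2-D list of pixel values; feature: block -> number;
    condition: number -> bool. Returns the labeled map image."""
    h, w = len(img), len(img[0])
    mh, mw = h // BLOCK, w // BLOCK    # one map pixel per block
    map_img = [[0] * mw for _ in range(mh)]
    for by in range(mh):
        for bx in range(mw):
            block = [row[bx * BLOCK:(bx + 1) * BLOCK]
                     for row in img[by * BLOCK:(by + 1) * BLOCK]]
            if condition(feature(block)):
                map_img[by][bx] = 1    # label the corresponding pixel
    return map_img

# Example: the feature is the count of pixels with value 1 in the block.
img = [[1 if x < 8 else 0 for x in range(16)] for _ in range(8)]
m = build_map_image(img,
                    feature=lambda b: sum(map(sum, b)),
                    condition=lambda f: f >= 32)
assert m == [[1, 0]]   # only the left block is labeled
```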
  • According to another aspect of the present invention, an image processing apparatus for labeling a connecting region of an image is provided.
  • the apparatus comprises a flag memory including flags indicating whether or not each pixel of the image is finished with examination, a candidate stack for storing indexes, each of indexes specifying a pixel of which adjacent pixels are to be examined, a region memory for storing indexes, each of indexes indicating one of pixels constituting the connecting region, a labeling unit for labeling pixels specified by the indexes stored in the region memory, and a connecting region detecting unit for detecting the pixels constituting the connecting region.
  • The connecting region detecting unit scans the image, examines whether or not each pixel of the image constitutes the connecting region, and sets the flag for the examined pixel.
  • When a pixel constituting the connecting region is first detected in the examination, the connecting region detecting unit (1) pushes an index of the detected pixel into the candidate stack, and stores the index to the region memory, (2) while there is at least one index stored in the candidate stack, repeats a) popping an index from the candidate stack to examine whether or not pixels which are adjacent to the pixel indicated by the popped index and whose flags indicate unexamined constitute the connecting region, b) when the adjacent pixels constitute the connecting region, pushing the indexes of the adjacent pixels into the candidate stack and storing the indexes to the region memory, and c) setting the flags for the adjacent pixels.
  • the labeling unit may calculate a size of a region including pixels indicated by indexes stored in the memory when labeling pixels, and prohibit labeling the pixel when the calculated size is out of a predetermined range.
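The size check described above might look as follows; the use of a bounding box and the particular side-length range are assumptions for illustration, not the patent's definition of region size.

```python
# Sketch of the size check at labeling time: compute the size of the
# region collected in the region memory and skip labeling when the size
# is out of a predetermined range (e.g. to reject noise specks).

def region_size_ok(region, min_side, max_side):
    """region: list of (y, x) indexes stored in the region memory.
    Returns True when the bounding-box sides are within range."""
    ys = [y for y, _ in region]
    xs = [x for _, x in region]
    h = max(ys) - min(ys) + 1
    w = max(xs) - min(xs) + 1
    return min_side <= h <= max_side and min_side <= w <= max_side

tiny = [(0, 0)]                       # 1x1 region: labeling prohibited
blob = [(0, 0), (0, 1), (1, 0)]       # 2x2 bounding box: labeled
assert not region_size_ok(tiny, 2, 10)
assert region_size_ok(blob, 2, 10)
```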
  • the region memory may comprise a stack memory.
  • According to another aspect of the present invention, an image processing method for labeling a connecting region of an image is provided.
  • The method comprises scanning the image, examining whether or not a pixel of the image constitutes the connecting region, setting the flag for the examined pixel, and when the pixel constituting the connecting region is first detected in the examination, (1) pushing an index of the detected pixel into a stack, and storing the index to a memory, (2) while there is at least one index stored in the stack, repeating a) popping an index from the stack to examine whether or not pixels which are adjacent to the pixel indicated by the popped index and whose flags indicate unexamined constitute the connecting region, b) when the adjacent pixels constitute the connecting region, pushing the indexes of the adjacent pixels into the stack and storing the indexes to the memory, and c) setting the flags for the adjacent pixels, and (3) reading all indexes out from the memory when there is no index stored in the stack, to label the pixels indicated by the read indexes with the same number.
  • According to another aspect of the present invention, a computer readable recording medium storing a program for controlling the computer to label a connecting region of an image is provided.
  • The program is capable of executing the following functions: a function for scanning the image; a function for examining whether or not a pixel of the image constitutes the connecting region; a function for setting the flag for the examined pixel; and when the pixel constituting the connecting region is first detected in the examination, (1) a function for pushing an index of the detected pixel into a stack, and storing the index to a memory, (2) while there is at least one index stored in the stack, repeating a) a function for popping an index from the stack to examine whether or not pixels which are adjacent to the pixel indicated by the popped index and whose flags indicate unexamined constitute the connecting region, b) when the adjacent pixels constitute the connecting region, a function for pushing the indexes of the adjacent pixels into the stack and storing the indexes to the memory, and c) a function for setting the flags for the adjacent pixels, and (3) a function for reading all indexes out from the memory when there is no index stored in the stack, to label the pixels indicated by the read indexes with the same number.
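The stack-based procedure of steps (1) to (3) can be sketched as follows, assuming a binary image, 8-connectivity, and Python lists standing in for the flag memory, the next-examination candidate stack and the region memory:

```python
# Hedged sketch of the stack-based labeling described above: one
# examination-end flag per pixel, a candidate stack of indexes whose
# neighbours remain to be examined, and a region memory collecting the
# indexes of one connecting region; the whole region gets one label.

def label_connecting_regions(img):
    h, w = len(img), len(img[0])
    flags = [[False] * w for _ in range(h)]   # examination-end flags
    labels = [[0] * w for _ in range(h)]
    label = 0
    for sy in range(h):
        for sx in range(w):
            if flags[sy][sx]:
                continue
            flags[sy][sx] = True
            if img[sy][sx] != 1:
                continue
            # (1) first pixel of a new connecting region found
            stack = [(sy, sx)]      # next-examination candidate stack
            region = [(sy, sx)]     # region memory
            # (2) examine the unexamined neighbours of each candidate
            while stack:
                y, x = stack.pop()
                for ny in (y - 1, y, y + 1):
                    for nx in (x - 1, x, x + 1):
                        if 0 <= ny < h and 0 <= nx < w and not flags[ny][nx]:
                            flags[ny][nx] = True
                            if img[ny][nx] == 1:
                                stack.append((ny, nx))
                                region.append((ny, nx))
            # (3) the stack is empty: give the whole region one label
            label += 1
            for y, x in region:
                labels[y][x] = label
    return labels

out = label_connecting_regions([[1, 1, 0, 0],
                                [0, 1, 0, 1],
                                [0, 0, 0, 1]])
assert out[0][0] == out[1][1] == 1   # one 8-connected region
assert out[1][3] == out[2][3] == 2   # a second region
```

Unlike the two-pass prior art, no provisional labels or link tables are needed: each region is fully collected before it is labeled, so every label is final the first time it is written.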
  • FIG. 1 is a diagram schematically showing the structure of an image processing apparatus according to the present invention.
  • FIG. 2 is a block diagram of the image processing apparatus, mainly showing a controller.
  • FIG. 3 is a flowchart showing a main routine in the image processing apparatus according to a first embodiment.
  • FIG. 4 is a diagram showing an object image (an original image) from which a specific pattern is retrieved and a map image corresponding thereto.
  • FIG. 5 is a flowchart showing an image (pattern) recognition process.
  • FIG. 6 is a diagram illustrating the scanning by a map filter on a map image.
  • FIG. 7 is a diagram showing an example of the map filter.
  • FIG. 8 is a diagram showing an example of an image filter.
  • FIG. 9 is a flowchart showing the detecting process on an image.
  • FIG. 10 is a flowchart showing the detecting process on an original image.
  • FIG. 11 is a diagram illustrating a correspondence of a position A specified on the map image and a search region A′ (scanning range) on the original image.
  • FIG. 12 is a flowchart showing a main routine in the image processing apparatus according to a second embodiment.
  • FIG. 13 is a diagram illustrating the scanning of a connecting region.
  • FIG. 14 is a diagram illustrating a state of expansion of adjacent pixels to be examined.
  • FIG. 15 is a diagram showing an examination-end flag, a next examination candidate stack and a connecting region stack which are logically constituted on a RAM.
  • FIG. 16 is a flowchart showing a labeling process of the connecting region.
  • FIG. 17 is a flowchart showing a connecting region detecting process.
  • FIG. 18A is a diagram showing an image including the connecting region comprising pixels with values of “1”.
  • FIG. 18B is a diagram showing the index of each pixel of an image including the connecting region.
  • FIG. 19 is a diagram illustrating pixels adjacent to the pixel to be examined.
  • FIGS. 20A to 20H are diagrams illustrating a transition of the states of the examination-end flag, the next examination candidate stack and the connecting region stack.
  • FIG. 21 is a diagram illustrating a labeling process in a connecting region according to the prior art.
  • An image processing apparatus serves to retrieve a specific pattern in an image.
  • FIG. 1 is a diagram showing the schematic structure of an image processing apparatus according to the present embodiment (which will be hereinafter referred to as a “system”).
  • the system comprises a central processing unit (which will be hereinafter referred to as a “CPU”), and is mainly constituted by a controller 1 for controlling the whole system.
  • A Pentium processor manufactured by Intel Corporation or the like is used for the CPU.
  • Connected to the controller 1 are a display 2 for displaying images, characters and the like and for operation displays, a keyboard 3 and a mouse 4 for carrying out various inputs, indicating operations and the like, a floppy disk drive 5 a and a hard disk drive 6 which are data saving media, a printer 7 for printing characters, image data and the like, a scanner 8 for fetching image data, a CD-ROM drive 9 b for reading data stored in a CD-ROM 9 a , a speaker 10 for outputting voice, and a microphone 11 for inputting voice.
  • FIG. 2 is a block diagram showing the present system.
  • a ROM 203 in which a program for controlling the system is stored and a RAM 204 for temporarily storing a program and data to be executed for control by the CPU 201 are connected to the CPU 201 through a data bus 220 .
  • a circuit to be connected to the CPU 201 through the data bus 220 includes a display control circuit 205 for controlling the display 2 for displaying images, characters or the like, a keyboard control circuit 206 for transferring and controlling an input from the keyboard 3 , a mouse control circuit 207 for transferring and controlling an input from the mouse 4 , a floppy disk drive control circuit 208 for controlling the floppy disk drive 5 b , a hard disk drive control circuit 209 for controlling the hard disk drive 6 , a printer control circuit 210 for controlling an output to the printer 7 , a scanner control circuit 211 for controlling the scanner 8 , a CD-ROM drive control circuit 212 for controlling the CD-ROM drive 9 b , a speaker control circuit 213 for controlling the speaker 10 , and a microphone control circuit 214 for controlling the microphone 11 .
  • a clock 202 for generating a reference clock necessary for operating the system is connected to the CPU 201 , and an expansion slot 215 for connecting various extension boards is connected to the CPU 201 through the data bus 220 .
  • A SCSI board is connected to the expansion slot 215 , and the floppy disk unit 5 b , the hard disk drive 6 , the scanner 8 , the CD-ROM drive 9 b or the like may be connected through the SCSI board.
  • Although the floppy disk drive 5 a and the hard disk drive 6 are used as the data storage media in the above-mentioned system, other information recording media such as a magneto-optical disk (MO) and the like may be used.
  • Although the scanner 8 is used as the image data input device, other data input devices such as a still video camera, a digital camera and the like may be used.
  • Although the printer 7 is used as an output device, other output devices such as a digital printer may be used.
  • a program for implementing a data management system is stored in the ROM 203 .
  • the programs may be stored in the information recording media such as the floppy disk drive 5 a , the hard disk drive 6 , the CD-ROM drive 9 b and the like, and a program and data may be read out from the information recording media to the RAM 204 if necessary and may be executed.
  • FIG. 3 is a flowchart showing a main routine of a program to be executed by the CPU 201 in the system.
  • The system (CPU 201 ) sequentially executes each process according to the user's operation.
  • First, an initializing process for initializing flags and the like necessary for each of the subsequent processes and for displaying an initial menu screen and the like is carried out (S 1 ).
  • Icons for selecting a process, such as “IMAGE RECOGNITION”, “OTHERS” and the like, are displayed on the initial menu screen.
  • When one of the icons is selected, a process corresponding to the selected icon is executed.
  • When “IMAGE RECOGNITION” is selected, the routine proceeds to an image recognizing process (S 3 ) for confirming the presence of a pattern having a specific layout (referred to as a “layout pattern”) in an input image. Then, the routine returns to the step S 2 .
  • When “OTHERS” is selected, another process (S 4 ) which is a predetermined process other than the image recognition is carried out. Then, the routine returns to the step S 2 .
  • When “END” is selected, a predetermined post-process (S 5 ) for terminating the operation of the system is carried out to end the control. If nothing is selected at the step S 2 , the system waits for the user to select one of the processes.
  • Since the other process (step S 4 ) and the post-process (step S 5 ) are basically the same as well-known processes used in a general information processing system, their description will be omitted.
  • the image recognition process (step S 3 ) will be described later in detail.
  • In the image recognizing process, a map image 22 is first created; the map image 22 indicates regions of the image 20 having features that coincide exactly or approximately with a part of or the whole of the specific pattern.
  • the map image 22 is created such that one pixel corresponds to one block obtained by dividing the whole region of the image 20 into some block regions having a predetermined size (8 ⁇ 8 pixels). Accordingly, the size of the map image 22 is an eighth of the original image 20 in the x and y directions in FIG. 4 .
  • the system uses the map image 22 to retrieve the specific pattern in the following manner.
  • the system extracts a feature of the image 20 for each block region, examines the feature for each block region and records the result on the pixel of the map image 22 corresponding to the block region of the image 20 (labeling). Then, the system presumes a region where the specific pattern on the image 20 is present based on the pixel pattern formed by the labeling on the map image 22 , and examines the presumed region on the image 20 in detail, thereby retrieving the specific pattern.
  • Specifying the retrieval region by using the map image 22 , which has a small size, reduces the processing time required for specifying the retrieval region. Therefore, the specific pattern retrieval can be carried out at a high speed. This process will be described below with reference to a flowchart shown in FIG. 5 .
  • image data of the image 20 is input (S 301 ).
  • the image 20 is an object to be retrieved with the specific pattern. More specifically, the image data of the image 20 designated by the user is loaded into the RAM 204 .
  • the input image 20 is divided into some block regions each having a predetermined size. The block size can be set to an optional size (m ⁇ n pixels).
  • the map image 22 is created (S 302 ). More specifically, a working region on the RAM 204 for the map image 22 is kept and all the pixel values of the map image 22 are initialized to a constant value.
  • When the image 20 has a size of Sx pixels × Sy pixels and the block size is m pixels × n pixels, a map image having a size of (Sx/m) pixels × (Sy/n) pixels is obtained.
  • Hereinafter, the case of a block size of 8 pixels × 8 pixels will be described.
  • a pointer i indicative of a block region to be processed is set to 1 (S 303 ).
  • the feature of the image is calculated for the ith block region indicated by the pointer i (S 304 ).
  • the feature of the ith block region is obtained in the following manner.
  • the feature is obtained for each pixel in the block region, and the pixel is labeled when the amount of feature is within a predetermined range.
  • the number of the labeled pixels in the block region is counted as the feature amount in the block.
  • the feature includes a) the number of pixels within a specific range in a color space, b) the strength of an edge, c) the co-occurrence characteristic obtained by conversion to a binary image and the like.
  • a pixel in a block region which has the same color as the specific pattern is labeled.
  • the labeled pixels are counted and the number thereof is set as the feature amount of the block region.
  • a block region having the feature amount within a predetermined error range can be a candidate region in which the specific pattern is to be searched.
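As an illustration of feature (a), the sketch below counts the pixels whose value falls within a given range, standing in for “the same color as the specific pattern”; the range and tolerance values are placeholders, and a scalar pixel value is used instead of a full color space for brevity.

```python
# Sketch of the colour-based feature amount (a): label the pixels whose
# value lies within a specific range, and use the labeled-pixel count as
# the feature amount of the block region.

def color_feature_amount(block, lo, hi):
    """Count (label) the pixels of the block whose value is in [lo, hi]."""
    return sum(1 for row in block for p in row if lo <= p <= hi)

def is_candidate(amount, expected, tolerance):
    """A block is a candidate region when its feature amount is within a
    predetermined error range around the expected amount."""
    return abs(amount - expected) <= tolerance

block = [[10, 200], [210, 205]]
amount = color_feature_amount(block, 190, 255)
assert amount == 3
assert is_candidate(amount, expected=4, tolerance=1)
```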
  • In the case where the specific pattern has an edge which is uniform within a certain error range for the block size, the image in the block region is scanned with an edge extracting filter, and the sum of all the resulting numeric values is calculated to obtain the feature amount (edge amount) of the block region.
  • a block having the feature amount within a predetermined error range can be a candidate region in which the specific pattern is searched.
  • the edge extracting filter is, for example, a Laplacian filter.
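Feature (b) might be computed as below with a 4-neighbour Laplacian kernel; summing absolute responses and skipping the block border are assumptions made for this illustration.

```python
# Sketch of the edge-amount feature (b): scan the block with a Laplacian
# filter and sum the absolute filter responses.

LAPLACIAN = [[0, 1, 0],
             [1, -4, 1],
             [0, 1, 0]]   # a common 4-neighbour Laplacian kernel

def edge_amount(block):
    h, w = len(block), len(block[0])
    total = 0
    for y in range(1, h - 1):          # interior pixels only, for brevity
        for x in range(1, w - 1):
            r = sum(LAPLACIAN[j][i] * block[y - 1 + j][x - 1 + i]
                    for j in range(3) for i in range(3))
            total += abs(r)
    return total

flat = [[5] * 4 for _ in range(4)]
edge = [[0, 0, 9, 9]] * 4
assert edge_amount(flat) == 0        # uniform block: no edge response
assert edge_amount(edge) > 0         # vertical edge produces a response
```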
  • the image in the block region is converted into a binary image with a certain threshold, and then the co-occurrence characteristic of each pixel with the adjacent pixel is set to the feature amount.
  • the co-occurrence characteristic represents a degree of gathering of the pixels having the same pixel value and indicates a probability that a certain pixel has the same pixel value as the value of a pixel adjacent thereto.
  • a high co-occurrence characteristic implies that the pixels having the same value are densely present, while a low co-occurrence characteristic implies that the pixels having the same value are coarsely present.
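Feature (c) can be illustrated as follows; defining the co-occurrence characteristic as the fraction of horizontally adjacent pixel pairs with equal values is an assumption made here, since the text does not fix an exact formula.

```python
# Sketch of the co-occurrence characteristic (c): binarise the block with
# a threshold, then estimate the probability that a pixel has the same
# value as its right-hand neighbour.

def binarize(block, threshold):
    return [[1 if p >= threshold else 0 for p in row] for row in block]

def cooccurrence(binary):
    same = total = 0
    for row in binary:
        for x in range(len(row) - 1):
            total += 1
            same += row[x] == row[x + 1]
    return same / total

dense = binarize([[200, 210], [20, 30]], 128)     # equal values gathered
checker = [[0, 1], [1, 0]]                        # equal values scattered
assert cooccurrence(dense) == 1.0    # high: same values densely present
assert cooccurrence(checker) == 0.0  # low: same values coarsely present
```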
  • The feature amount of the ith block region is calculated (S 304 ). Then, it is decided whether or not the feature amount is within a predetermined range, that is, whether the feature amount is acceptable (S 305 ).
  • The predetermined range is set such that the image in the block region can be determined to be equivalent to a part of or the whole of the specific pattern based on the calculated feature amount.
  • When the feature amount is within the predetermined range, a pixel on the map image 22 corresponding to the ith block region is labeled (S 306 ).
  • When the feature amount is out of the range, the step S 306 is skipped.
  • The routine returns to the step S 304 when the pointer i does not exceed the number of all the block regions, and the above-mentioned processes (S 304 to S 308 ) are repeated until the extraction and decision of the feature amount and the like are completed for all the block regions of the image 20 .
  • a process for retrieving a specific pattern is carried out by using the map image 22 (S 309 ), and the routine then returns.
  • Next, the retrieving process of the image (step S 309 ) will be described.
  • a region on the image 20 where a specific pattern might be present is presumed by using the map image 22 labeled based on the feature amount of the image 20 , and a specific pattern is retrieved for the presumed region on the image 20 .
  • the region on the map image 22 is presumed in the following manner.
  • the map image 22 labeled based on the feature amount of the image 20 is scanned by using a filter (hereinafter referred to as a “map filter”) for recognizing the predetermined pattern (hereinafter referred to as a “region specifying pattern”) for specifying a region, and the region specifying pattern is retrieved from the map image 22 .
  • the map image 22 is scanned by using a map filter 22 f as shown in FIG. 6 .
  • a specific pattern is retrieved by using a filter (hereinafter referred to as an “image filter”) for recognizing a specific pattern in a region on the original image 20 corresponding to a position on the map image 22 where the region specifying pattern is detected.
  • FIG. 7 shows an example of the map filter 22 f , and FIG. 8 shows an example of the image filter 20 f .
  • Each of these filters 22 f and 20 f has a central portion (a region hatched with dots in the drawing) and a peripheral portion surrounding the central portion (a region hatched with slant lines in the drawing). It is preferable that each filter 20 f or 22 f has a size equal to the size of the specific pattern to be retrieved.
  • The position of the upper left corner of each of the filters 20 f and 22 f is treated as the position of the filter.
  • a circular pattern is used as an example of the specific pattern to be recognized and detected.
  • The process of retrieving an image (step S 309 ) will be described with reference to a flowchart shown in FIG. 9 .
  • the position of the scanning start of the map filter 22 f is first determined. More specifically, the coordinates in the y direction of the map filter 22 f are set to be positioned on the upper end of the map image 22 (S 401 ), and coordinates in the x direction of the map filter 22 f are set to be positioned on the left end of the map image 22 (S 402 ).
  • a pattern is detected by using the map filter 22 f in that position (S 403 ).
  • the numbers of labeled pixels are counted in the central portion of the map filter 22 f and in the peripheral portion of the central portion (which will be hereinafter referred to as a “peripheral portion”) shown in FIG. 7, respectively.
  • a region specifying pattern is determined to be detected when the number (M) counted in the central portion of the map filter 22 f is within a predetermined range of the central portion and the number (N) counted in the peripheral portion is within a predetermined range of the peripheral portion.
  • the region specifying pattern may be decided to be detected or found.
  • the predetermined ranges for the respective count numbers of M and N are properly set according to the retrieval precision of the map filter 22 f .
  • a detection position thereof is set to the coordinates on the upper left end of the map filter 22 f.
  • the map filter 22 f is moved by a predetermined amount (for example, one pixel) in the x direction (S 406 ). It is decided whether or not the right end of the map filter 22 f gets over the right end of the map image 22 , that is, the map filter 22 f gets out of the right end of the map image 22 (S 407 ). When the map filter 22 f does not get out of the map image 22 , the routine returns to S 403 where a region specifying pattern is detected in a new position. When the map filter 22 f gets out of the map image 22 , the map filter 22 f is moved by a predetermined amount (for example, one line) in the y direction (S 408 ).
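The scanning loop of steps S 401 to S 408 can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: the filter encoding ('c' for the central portion, 'p' for the peripheral portion), the function name `scan_map_image` and the range parameters are assumptions introduced for illustration.

```python
# Hypothetical sketch of the map-filter scan (S401-S408): the filter is slid
# across the labeled map image one pixel at a time; at each position the
# labeled pixels in its central and peripheral portions are counted, and a
# region specifying pattern is reported when both counts fall within their
# predetermined ranges.

def scan_map_image(map_image, filt, center_range, periph_range):
    """Return (x, y) upper-left positions where the pattern is detected.

    map_image : 2-D list of 0/1 labels
    filt      : 2-D list, 'c' = central portion, 'p' = peripheral portion
    """
    fh, fw = len(filt), len(filt[0])
    h, w = len(map_image), len(map_image[0])
    hits = []
    for y in range(h - fh + 1):          # shift one line at a time (S408)
        for x in range(w - fw + 1):      # shift one pixel at a time (S406)
            m = n = 0
            for fy in range(fh):
                for fx in range(fw):
                    if map_image[y + fy][x + fx] == 1:
                        if filt[fy][fx] == 'c':
                            m += 1       # count M in the central portion
                        elif filt[fy][fx] == 'p':
                            n += 1       # count N in the peripheral portion
            if center_range[0] <= m <= center_range[1] and \
               periph_range[0] <= n <= periph_range[1]:
                hits.append((x, y))      # detection position = upper-left end
    return hits
```

The detection position recorded for each hit is the upper-left coordinate of the filter, as described at S 404 above.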
  • The retrieving process on the original image (step S 405 ) will be described with reference to a flowchart shown in FIG. 10.
  • a position on the original image 20 corresponding to the position on the map image 22 where the region specifying pattern is detected is obtained, and a specific pattern is retrieved by using the image filter 20 f in the retrieving region set based on this position.
  • a block size of 8 is assigned to a variable m for storing a block size (S 501 ).
  • the block size may be 2 or more.
  • a position (X, Y) on the original image 20 corresponding to a position (x, y) of the region specifying pattern detected on the map image 22 is calculated (S 502 , S 503 ).
  • the corresponding position (X, Y) on the image 20 can be obtained by multiplying, by m, each position (x, y) on the map image 22 where the region specifying pattern is detected.
  • a block A′(X, Y) on the image 20 corresponding to a pixel A of the coordinate (x, y) on the map image 22 as shown in FIG. 11 indicates the following rectangular region:
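Assuming the block A′(X, Y) is the m × m square whose upper left corner is (X, Y) (the exact region expression is not reproduced in the source), the coordinate correspondence of steps S 502 and S 503 can be sketched with a small helper; the function name is hypothetical and the default block size m = 8 follows the value assigned at S 501 .

```python
# Illustrative helper for S502-S503: a detection position (x, y) on the map
# image maps back to (X, Y) = (m*x, m*y) on the original image, and the map
# pixel A(x, y) corresponds to the m-by-m block A'(X, Y) whose upper left
# corner is that position.

def map_to_image_block(x, y, m=8):
    X, Y = x * m, y * m
    # upper-left and lower-right corners of the block A'(X, Y)
    return (X, Y), (X + m - 1, Y + m - 1)
```

For example, a region specifying pattern detected at (3, 2) on the map image corresponds to the block whose corners are (24, 16) and (31, 23) on the original image.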
  • a specific pattern is detected by using the image filter 20 f in that position (S 504 ).
  • the specific pattern is detected by using the image filter 20 f in the following manner, for example.
  • the feature amount of each pixel is extracted in the central portion of the image filter 20 f shown in FIG. 8 (a region hatched with a dot) and a peripheral portion thereof (a region hatched in a slant line).
  • the pixel is labeled.
  • the labeled pixel is counted in the central and peripheral portions of the image filter 20 f , respectively.
  • a specific pattern is decided to be detected.
  • a specific pattern is present in the central portion of the image filter 20 f (in a circle shown by a broken line in the drawing).
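The detection at S 504 described in the bullets above can be sketched as follows, with a simple intensity threshold standing in for the actual feature extraction; the function name, the filter encoding ('c' for the central portion, 'p' for the peripheral portion, ' ' for cells outside the filter shape) and the count ranges are illustrative assumptions.

```python
# Hedged sketch of pattern detection with the image filter (S504): the
# feature amount of each pixel under the filter is extracted and the pixel is
# labeled when it meets a condition (here, intensity >= threshold); labeled
# pixels are counted in the central and peripheral portions, and the specific
# pattern is decided to be detected when both counts fall within their
# predetermined ranges.

def detect_with_image_filter(image, filt, x, y, threshold,
                             center_range, periph_range):
    m = n = 0
    for fy in range(len(filt)):
        for fx in range(len(filt[0])):
            part = filt[fy][fx]
            if part == ' ':
                continue
            labeled = image[y + fy][x + fx] >= threshold  # feature extraction
            if labeled:
                if part == 'c':
                    m += 1
                else:
                    n += 1
    return (center_range[0] <= m <= center_range[1]
            and periph_range[0] <= n <= periph_range[1])
```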
  • the predetermined process includes, for example, a process of recording the detected coordinate value of the pattern on a storage device such as a hard disk, a process of displaying, on a display unit, a message saying that the specific pattern has been detected, and the like.
  • the step S 506 is skipped.
  • the X coordinate of the image filter 20 f is incremented (S 507 ) and it is decided whether or not the right end of the image filter 20 f gets over a scanning range (S 508 ).
  • the routine returns to the step S 504 where a specific pattern is detected in a new position shifted in the X direction in the same manner as described above.
  • the Y coordinate of the image filter 20 f is incremented (S 509 ) and it is decided whether or not the lower end of the image filter 20 f gets over the scanning range (S 510 ).
  • the routine returns to the step S 503 where a specific pattern is detected in a position on the left end of the scanning range which is lower by one line than the position at the last time.
  • the above-mentioned processes (S 503 to S 510 ) are repeated until the scanning operation by the image filter 20 f over the predetermined scanning range is completed.
  • the image processing apparatus creates the map image 22 having pixels each corresponding one to one to a block region of the original image 20 , and each pixel of the map image 22 is labeled based on the feature amount of the image 20 , when the specific pattern is retrieved from the image 20 . Then, the region where the specific pattern is present is specified on the image based on the pattern of the map image 22 formed by the labeling, and the process of retrieving a specific pattern is carried out in the region on the original image 20 corresponding to the specified region on the map image 22 . More specifically, when the block size of the original image 20 is m × n, the size of the map image 22 is 1/(m × n) of the size of the image 20 , which is much smaller.
  • the position of the specific pattern is specified by using the map image 22 having a smaller size, so that the working region required for the process is reduced. Furthermore, the required processing time can also be shortened. As a result, the specific pattern on the image can be retrieved at a high speed. Moreover, since the map image 22 is labeled based on the feature amount of the original image 20 , the characteristics of the original image 20 are kept. Consequently, it is possible to specify the detection region with high precision.
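As a minimal sketch of how such a map image might be built, assuming the feature amount is the mean pixel value of each m × n block (the apparatus may equally use concentration values, edge amounts or co-occurrence characteristics; the function name and the condition callback are ours):

```python
# Divide the original image into m-by-n blocks, extract a feature for each
# block (here: the mean pixel value), and label the corresponding map-image
# pixel 1 when the feature meets the condition, 0 otherwise.

def make_map_image(image, m, n, condition):
    h, w = len(image), len(image[0])
    map_img = []
    for by in range(0, h, n):
        row = []
        for bx in range(0, w, m):
            block = [image[y][x]
                     for y in range(by, min(by + n, h))
                     for x in range(bx, min(bx + m, w))]
            mean = sum(block) / len(block)
            row.append(1 if condition(mean) else 0)
        map_img.append(row)
    return map_img
```

Each pixel of the resulting map image corresponds one to one to a block of the original image, so the map is 1/(m × n) the size of the original.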
  • An image processing apparatus serves to detect and label, as a connecting region in a binary image, a continuous region composed of pixels having the same pixel value.
  • FIG. 12 is a flowchart showing a main routine of a program to be executed by a CPU 201 in a system according to the present embodiment.
  • When “LABELING OF CONNECTION REGION” is selected at a step S 12 , the labeling process is performed in which a connecting region in an input image is labeled (S 13 ).
  • the labeling processing of a connecting region (step S 13 ) will be described below.
  • the image processing apparatus detects and labels, as a connecting region, a continuous region including pixels each having a value of “1” in a binary image constituted by a pixel having a value of “0” or “1”.
  • the image processing apparatus detects a connecting region in the image by scanning the image. The scanning operation is carried out as shown in FIG. 4 . First, the scanning starts from the upper left end of the image 20 and proceeds in the right direction (the x direction). When the scanning reaches the right end of the image 20 , it returns to the left end of the image, shifts downward (in the y direction) by one line, and proceeds in the x direction again from that position.
  • the image processing apparatus scans the image 20 .
  • When one pixel (a) (having a pixel value of “1”) constituting the connecting region D is first detected, it is examined whether or not a pixel adjacent to the pixel (a) constitutes the connecting region D together with the pixel (a).
  • When the adjacent pixel constitutes the connecting region D , it is examined whether or not a pixel adjacent to that adjacent pixel similarly constitutes the connecting region D .
  • a pixel adjacent to the adjacent-adjacent pixel is further examined in the same manner. As long as a pixel constituting the connecting region D is continuously detected in the same manner, the same examination is carried out for a pixel adjacent to the detected pixel.
  • When the pixel (a) is first detected as a pixel constituting the connecting region D , it is examined whether or not the value of a pixel adjacent to the pixel (a) is “1”. When a pixel (b) having a value of “1” adjacent to the pixel (a) is found, a pixel adjacent to the pixel (b) is further examined. When a pixel (c) having a value of “1” adjacent to the pixel (b) is found, a pixel adjacent to the pixel (c) is further examined. Subsequently, as long as a pixel having a value of “1” is found, the examination is continued in the same manner. Consequently, it is possible to finally detect one connecting region D , that is, all the pixels constituting the connecting region D . Then, all the detected pixels are labeled with the same number.
  • the image processing apparatus has an examination state flag, a next examination candidate stack and a connecting region stack. These are logically constituted on a RAM 204 as shown in FIG. 15, for example.
  • the examination state flag 15 includes an examination-end flag which corresponds to each pixel of the image 20 and indicates whether or not each pixel has been examined.
  • the examination state flag comprises one-dimensional array data, and one array element specified by the index of a pixel corresponds to one examination-end flag.
  • the index is information such as a number for specifying each pixel in an image.
  • the examination-end flag of “1” (ON) indicates that the pixel has been examined, and the examination-end flag of “0” (OFF) indicates that the pixel has not been examined yet.
  • a next examination candidate stack 17 serves to store the indexes of pixels whose adjacent pixels are to be examined.
  • a connecting region stack 19 serves to store the index of a pixel constituting the connecting region.
  • Both stacks 17 and 19 are memories or storage means having a Last In First Out (LIFO) function. By pushing, data are stored in the stacks 17 and 19 , and by popping, the stored data are fetched from the stacks 17 and 19 .
  • Both stacks 17 and 19 may store other information for specifying a pixel in place of the index of a pixel.
  • the stacks 17 and 19 may also be implemented by storage means without the LIFO function. In this case, it is necessary to separately manage which data have already been read out from the storage means.
  • The above-mentioned labeling process of the connecting region (step S 13 ) will be described with reference to flowcharts shown in FIGS. 16 and 17.
  • an image 21 shown in FIG. 18A is used as an image to be subjected to the labeling process.
  • the image 21 has a size of 5 pixels ⁇ 5 pixels, and has two connecting regions E and F.
  • a value indicated in each pixel of the image 21 means a pixel value.
  • a numeral in parentheses which is allocated to each pixel of the image 21 means the index of the pixel. The index is calculated by the following equation based on a coordinate value (x, y):
  • pixel index = x + y × (image size in the x direction).
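For the 5-pixel-wide image 21 , for example, the pixel at (x, y) = (1, 2) has index 1 + 2 × 5 = 11. A direct transcription of the formula and its inverse (the function names are ours):

```python
# Index formula from the text: pixel index = x + y * (image width).

def pixel_index(x, y, width):
    return x + y * width

def index_to_xy(index, width):
    # inverse mapping, useful for checking
    return index % width, index // width
```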
  • the image data of the binary image 21 are first input (S 1301 ). More specifically, the image data of the binary image 21 are loaded into the RAM 204 .
  • the examination state flag 15 is created (S 1302 ). That is, the working region of the examination state flag 15 is kept in a predetermined region on the RAM 204 and the examination-end flag constituting the examination state flag 15 is initialized to “0”.
  • the examination state flag 15 can also be implemented by assigning one bit among a plurality of bits to each examination-end flag.
  • next examination candidate stack 17 and the connecting region stack 19 are created (S 1303 , S 1304 ). More specifically, the working regions of the next examination candidate stack 17 and the connecting region stack 19 are held in predetermined regions on the RAM 204 .
  • variables SIZEX, SIZEY and LABEL are initialized (S 1305 ). Concretely, the variable SIZEX is set to the x-direction size value of the image 21 (for example, 5), the variable SIZEY is set to the y-direction size value of the image 21 (for example, 5), and the variable LABEL is set to 1. The variable LABEL gives a label number for labeling the connecting region.
  • a variable y for giving the y coordinate of an examination position is initialized to 0 (S 1306 ), and furthermore, a variable x for giving the x coordinate of the examination position is initialized to 0 (S 1307 ).
  • a connecting region detecting process for detecting and labeling a connecting region including the pixel is carried out (S 1309 ).
  • the examination-end flag is set to ON for the examined pixel (S 1310 ) and the routine then proceeds to a step S 1311 .
  • the variable x is incremented to shift the examination position in the x direction, and then the variable x is compared with the variable SIZEX to decide whether or not the examination position is outside the range of the image 21 (S 1312 ).
  • the routine returns to the step S 1308 where the connecting region is examined in a new position shifted in the x direction in the same manner as described above.
  • the variable y is incremented to shift the examination position in the y direction (S 1313 ), and the variable y is then compared with the variable SIZEY to decide whether or not the examination position is outside the range of the image 21 (S 1314 ).
  • the routine returns to the step S 1307 where the connecting region is examined in a new position on the left end of the image 21 which is lower by one line than the last position, in the same manner as described above at the last time.
  • the routine returns. The above-mentioned processes are repeated until the whole image 21 is examined and all the connecting regions included in the image are detected to be labeled (S 1307 to S 1314 ).
  • The above-mentioned connecting region detecting process (step S 1309 ) is described with reference to the flowchart shown in FIG. 17 .
  • it is examined whether or not a pixel on the periphery of the first pixel of the connecting region constitutes the same connecting region, thereby detecting and labeling the connecting region.
  • the index of the first pixel of the connecting region is first pushed into the next examination candidate stack 17 (S 1401 ), and is also pushed into the connecting region stack 19 (S 1402 ).
  • the examination-end flag of the first pixel of the connecting region is set ON (S 1403 ).
  • the index is popped from the next examination candidate stack 17 (S 1404 ).
  • the adjacent pixels P i are set sequentially as P0, P1, P2, . . . P7 counterclockwise from the lower left pixel among the eight pixels surrounding a central pixel (the pixel (a)) as shown in FIG. 19 .
  • Alternatively, only the four pixels above, below, to the right of and to the left of the central pixel may be set as the adjacent pixels.
  • a variable i for specifying the adjacent pixel P i is initialized to 0 (S 1406 ). It is decided whether or not the adjacent pixel P i indicated by the variable i has not been examined yet but is a pixel to be labeled (S 1407 ). When the pixel has not been examined but is to be labeled, the index of the pixel P i is pushed into both stacks 17 and 19 (S 1408 ). Then, the examination-end flag of the pixel P i is set ON (S 1409 ). When the pixel has been examined or is not to be labeled, these steps (S 1408 , S 1409 ) are skipped. Thereafter, the variable i is incremented (S 1410 ), and the above-mentioned processes (S 1406 to S 1411 ) are repeated until it has been examined whether or not every adjacent pixel P i is in the connecting region.
  • the routine returns to the step S 1404 .
  • the routine proceeds to a step S 1413 .
  • the connecting region is examined for a pixel adjacent to the pixel indicated by the index stored in the next examination candidate stack 17 (S 1404 to S 1412 ).
  • One connecting region is constituted by pixels indicated by all the indexes stored in the connecting region stack 19 when there is no index stored in the next examination candidate stack 17 .
  • the routine returns.
  • all the indexes are popped from the connecting region stack 19 (S 1414 ), and the pixel indicated by the popped index is labeled with a label number indicated by the variable LABEL (S 1415 ). Then, the variable LABEL is incremented (S 1416 ) and the routine then returns.
  • when the size of the connecting region is smaller than a predetermined value, the region may be treated as a noise so that the labeling process is not carried out.
  • in this case, a process for initializing a variable SUM to 0 is added at the step S 1305 of the above flowchart, and furthermore a step for incrementing the variable SUM is added between the steps S 1402 and S 1403 and between the steps S 1408 and S 1409 .
  • the step S 1413 is changed such that the variable SUM is compared with a predetermined value and the routine proceeds to the step S 1414 when the variable SUM is equal to or greater than the predetermined value and the routine otherwise returns.
  • the predetermined value is set to 8, for example, in the case where a small connecting region having less than eight pixels is to be ignored as a noise. Furthermore, when the connecting region size exceeds a maximum size, the labeling process of the connecting region can be terminated. In this case, the step S 1413 is changed such that the variable SUM is compared with a predetermined value (a maximum size value) and the routine returns when the variable SUM is equal to or greater than the predetermined value, or otherwise the routine proceeds to the step S 1414 . While the size of the connecting region is calculated from the number of pixels constituting the connecting region as described above, the size of the connecting region may also be obtained from the sizes of the detected connecting region in the x and y directions.
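Putting the flowcharts of FIGS. 16 and 17 together, the single-pass labeling with the examination state flags, the next examination candidate stack and the connecting region stack can be sketched as below. This is an interpretive reconstruction, not the claimed implementation: Python lists stand in for the stacks, (x, y) tuples stand in for the pixel indexes, and the `min_size` parameter corresponds to the SUM-based noise rejection.

```python
def label_connecting_regions(image, min_size=1):
    """Stack-based single-pass labeling sketch (after FIGS. 16-17).

    image: 2-D list of 0/1 pixel values. Returns a 2-D list of label
    numbers (0 = background or region ignored as noise), labels from 1.
    """
    sy, sx = len(image), len(image[0])
    examined = [[False] * sx for _ in range(sy)]   # examination state flags
    labels = [[0] * sx for _ in range(sy)]
    label = 1                                      # variable LABEL
    # eight adjacent pixels P0..P7 (four-neighbour connectivity also works)
    neighbours = [(-1, 1), (0, 1), (1, 1), (1, 0),
                  (1, -1), (0, -1), (-1, -1), (-1, 0)]
    for y in range(sy):                            # S1306-S1314 scan loop
        for x in range(sx):
            if examined[y][x]:
                continue
            examined[y][x] = True                  # S1310 / S1403
            if image[y][x] != 1:
                continue
            # first pixel of a connecting region found (S1401-S1402)
            candidates = [(x, y)]                  # next examination candidate stack
            region = [(x, y)]                      # connecting region stack
            while candidates:                      # S1404-S1412
                cx, cy = candidates.pop()
                for dx, dy in neighbours:
                    nx, ny = cx + dx, cy + dy
                    if (0 <= nx < sx and 0 <= ny < sy
                            and not examined[ny][nx] and image[ny][nx] == 1):
                        candidates.append((nx, ny))   # S1408
                        region.append((nx, ny))
                        examined[ny][nx] = True       # S1409
            if len(region) >= min_size:            # SUM check (S1413)
                for rx, ry in region:              # pop and label (S1414-S1415)
                    labels[ry][rx] = label
                label += 1                         # S1416
            # a smaller region is left unlabeled, i.e. treated as noise
    return labels
```

Because a whole region is collected before its label is assigned, the label number is final at detection time and no re-labeling pass is needed.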
  • FIGS. 20A to 20 H are diagrams showing a change in the states of the examination state flag 15 , the next examination candidate stack 17 and the connecting region stack 19 which are obtained when the image 21 is subjected to the connecting region labeling process according to the process of the flowchart described above.
  • FIG. 20A shows a state before the examination is started. All the examination-end flags are initialized to “0” and no index is stored in both stacks 17 and 19 .
  • FIG. 20B shows a state in which the first pixel of the connecting region is detected. The pixel of an index of “6” is detected as the first pixel in the connecting region, and the index of “6” is pushed into the next examination candidate stack 17 and the connecting region stack 19 .
  • FIG. 20C shows a state in which the index of “6” is popped from the next examination candidate stack 17 and the examination of the pixel P 0 adjacent to the first pixel in the connecting region specified by the index is completed. The examination-end flag of the pixel P 0 is set to “1”.
  • FIG. 20D shows a state in which the examination of the adjacent pixel P 1 is completed.
  • the pixel P 1 has not been examined yet but is a pixel to be labeled. Therefore, the examination-end flag is set to “1” and an index of “11” of the pixel P 1 is pushed into the next examination candidate stack 17 and the connecting region stack 19 .
  • FIG. 20E shows a state in which the examination of the adjacent pixel P 2 is completed. The pixel P 2 has not been examined but is a pixel to be labeled. Therefore, the examination-end flag is set to “1” and an index of “12” of the pixel P 2 is pushed into the next examination candidate stack 17 and the connecting region stack 19 .
  • FIG. 20F shows a state in which the examination is completed for the pixel P 0 adjacent to the pixel indicated by the index of “12”.
  • the examination-end flag of the pixel P 0 is set to “1”.
  • FIG. 20G shows a state in which the examination is completed for the pixels P 0 to P 7 adjacent to the pixel indicated by the index of “12”.
  • FIG. 20H shows a state in which the examination is completed for the pixel adjacent to the pixel indicated by the index of “11” which is popped from the next examination candidate stack 17 .
  • the index stored in the connecting region stack 19 at this time indicates a pixel constituting a first connecting region E of the image 21 .
  • the image processing apparatus detects each connecting region independently. Therefore, the label number for a region is determined simultaneously with the detection of the connecting region. Consequently, the work of re-labeling label numbers required in the prior art is unnecessary, and the connecting regions can be labeled by one-time image scanning. Therefore, it is not necessary to carry out the scanning twice as in the prior art, and the labeling process can be executed at a high speed. Moreover, the connecting regions are detected one by one for the labeling. Therefore, in an environment capable of executing parallel processing, a predetermined process such as an image recognition process can be carried out for one detected connecting region while the other connecting regions are being detected.


Abstract

The present invention provides an image processing apparatus capable of retrieving a specific pattern from an image at a high speed in an image recognizing process. The apparatus creates a map image having pixels corresponding one to one to block regions of the image when retrieving a specific pattern from the image. Then, the apparatus labels each pixel of the map image based on a feature of each block region of the image, detects a predetermined pattern specifying a region on the map image, specifies a block region on the original image corresponding to a position of the detected predetermined pattern on the map image, and retrieves the specific pattern in the specified block region.

Description

This application is a continuation divisional of application Ser. No. 09/538,735, filed on Mar. 30, 2000 now U.S. Pat. No. 6,643,400.
This application is based on applications Nos. 11-92050 and 11-92094 filed in Japan, the contents of which are incorporated herein by reference.
TITLE OF THE INVENTION
Image Processing Apparatus And Method for Recognizing Specific Pattern and Recording Medium Having Image Processing Program Recorded Thereon
BACKGROUND OF THE INVENTION
1. Technical Field of the Invention
The present invention relates to an image processing technique for recognizing a specific pattern included in an image.
2. Related Art
Conventionally, in an image processing apparatus, the position of a specific pattern included in an image has been detected, and a pattern matching process has been carried out to recognize the shape of the specific pattern, by storing data on each pixel captured by a camera or the like in an image memory as binary data or multivalued data and matching, pixel by pixel, the stored data with data on each pixel of a reference pattern stored beforehand in the memory. In this matching, the whole image has been scanned while shifting the reference pattern one pixel at a time in the x or y direction over the image including the specific pattern.
In this method, however, the image including the specific pattern and the reference pattern are shifted and matched every pixel. Therefore, there has been a problem in that a very long time is required for the process.
In the image recognizing process, moreover, one method for detecting the position of the specific pattern of an image at a high speed is to create a reduced image from the image, perform pattern matching on the reduced image to retrieve the specific pattern, and then carry out retrieval again in the region of the original image corresponding to the detected position of the specific pattern on the reduced image. In this method, however, the specific pattern cannot always be detected accurately, depending on the algorithm for creating the reduced image. For example, for an image whose pixel value alternates every pixel, such as a checkered pattern, when thinning is carried out every other pixel to create a reduced image, the created reduced image contains only one of the pixel values appearing in the original image. Consequently, the correct specific pattern cannot be detected from such a reduced image.
Moreover, examples of a process to be carried out prior to a process such as image recognition include a labeling process of detecting a region to be formed by pixels having the same value which are continuous in a binary image (hereinafter referred to as a “connecting region”) and giving a label number to the connecting region. As a method for the labeling process, various methods have conventionally been proposed.
FIG. 21 is a diagram illustrating one example of the labeling process in the connecting region of the image according to the prior art. In FIG. 21, an original image 20 includes three connecting regions A, B and C. These connecting regions A, B and C are continuous regions comprising pixels having a pixel value of “1”. When labeling the connecting regions of the image, the whole image is scanned to examine the pixel value of each pixel and to give a provisional label number to each pixel decided to constitute a connecting region. At this stage, the provisional label number is not always accurate. Therefore, re-labeling of the label numbers is carried out later in some cases. For this reason, information for the re-labeling is stored in tables 51 and 53. Then, the image is scanned again to replace the provisional label numbers with real label numbers with reference to the tables 51 and 53.
More specifically, when a pixel having a pixel value of “1” is detected during the scanning of the image 20 in a first scanning operation, it is decided whether or not there is any adjacent pixel having a label number given thereto. If there is an adjacent pixel having a label given thereto, the same label number as in the adjacent pixel is given to the detected pixel. When a plurality of adjacent pixels have label numbers given thereto, the same label number as in the adjacent pixel having a higher priority is given based on a predetermined priority. When there is no adjacent pixel having a label number given thereto, a new label number is given to the detected pixel. Thus, the inside of the image is scanned to give label numbers to the pixels having the pixel value of “1”.
As described above, in the image having the label numbers given thereto, pixels having the same label number form one connecting region. However, a plurality of label numbers are given to one connecting region depending on the shape of the connecting region in some cases. As shown in FIG. 21, for example, the connecting region A of the image 20 is labeled as three regions having label numbers 2, 3 and 4 in an image 20′ as a result of the above-mentioned labeling. In the labeling process, a link table 51 is created. The link table 51 stores information indicating that the three regions having the label numbers 2, 3 and 4 are identical. Furthermore, label editing is carried out to create a conversion table 53 which unifies the label numbers 2, 3 and 4 into the label number 2 and re-labels the regions having the label numbers 5 and 6 into the label numbers 3 and 4, which have thus become vacant.
Then, in a second scanning operation, each of the pixels of the image 20′ is scanned and a label number is converted (re-labeled) into a real label number by referring to the information of the conversion table 53. Consequently, each region is finally labeled correctly as in an image 20″.
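For comparison, this conventional two-pass method might be sketched as follows; the priority rule (the smallest neighbouring label wins), the dictionary representation of the link table and the function name are simplifying assumptions on our part, not the exact prior-art procedure.

```python
# Illustrative sketch of the conventional two-pass labeling (FIG. 21):
# a first scan gives provisional label numbers (taking a labeled neighbour's
# number when one exists) and records equivalences in a link table; the
# second scan re-labels every pixel via the conversion table.

def two_pass_label(image):
    h, w = len(image), len(image[0])
    labels = [[0] * w for _ in range(h)]
    link = {}                       # link table: provisional -> representative
    def find(a):                    # follow the chain to the representative
        while link[a] != a:
            a = link[a]
        return a
    next_label = 1
    for y in range(h):              # first scanning operation
        for x in range(w):
            if image[y][x] != 1:
                continue
            # previously scanned neighbours (left, upper-left, up, upper-right)
            neigh = [labels[ny][nx]
                     for nx, ny in ((x-1, y), (x-1, y-1), (x, y-1), (x+1, y-1))
                     if 0 <= nx < w and 0 <= ny < h and labels[ny][nx] > 0]
            if not neigh:
                labels[y][x] = next_label        # new provisional label
                link[next_label] = next_label
                next_label += 1
            else:
                m = min(neigh)                   # priority: smallest number
                labels[y][x] = m
                for n in neigh:                  # record equivalences
                    link[find(n)] = find(m)
    # label editing: build the conversion table with consecutive real numbers
    conv, real = {}, {}
    for p in range(1, next_label):
        r = find(p)
        if r not in real:
            real[r] = len(real) + 1
        conv[p] = real[r]
    for y in range(h):              # second scanning operation (re-labeling)
        for x in range(w):
            if labels[y][x]:
                labels[y][x] = conv[labels[y][x]]
    return labels
```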
In the above-mentioned method, however, it is necessary to scan the whole image twice. Therefore, a long time is taken to perform the labeling process. The time required for the processing is prolonged as the size of the image is increased.
SUMMARY OF THE INVENTION
The present invention is directed to solve the above-mentioned problem, and has an object to provide an image processing apparatus and method capable of retrieving a specific pattern from an image at a high speed and a recording medium storing a program therefor.
The invention has another object to provide an image processing apparatus and method for a labeling processing in the connecting region of an image at a high speed, and a recording medium storing a program therefor.
In a first aspect of the invention, an image processing apparatus is provided for retrieving a specific pattern of an image. The apparatus comprises a unit for dividing the image into a plurality of block regions, each of the block regions having a predetermined size, a unit for creating a map image comprising a plurality of pixels, each of the pixels corresponding to one of the block regions by one to one, a unit for extracting features of the image for each of the block regions, a unit for deciding whether or not the extracted feature meets a predetermined condition, a unit for labeling the pixel on the map image based on a result of the decision, the pixel corresponding to the block region of which feature meets the condition, a unit for detecting a predetermined pattern for specifying a search region on the image from the labeled map image, a unit for specifying the search region based on a position on the map image in which the predetermined pattern is detected, and a unit for retrieving the specific pattern in the specified search region on the image.
In the apparatus, at least one of a concentration value of a pixel, an edge amount and a co-occurrence characteristic may be extracted as the feature amount. In the apparatus, the predetermined pattern for determining the search region may be detected by using a filter having the same size as a size of the specific pattern in the map image.
In a second aspect of the invention, an image processing method is provided for retrieving a specific pattern of an image. The method comprises dividing the image into a plurality of block regions, each of the block regions having a predetermined size, creating a map image comprising a plurality of pixels, each of the pixels corresponding to one of the block regions by one to one, extracting features of the image for each of the block regions, deciding whether or not the extracted feature meets a predetermined condition, labeling the pixel on the map image based on a result of the decision, the pixel corresponding to the block region of which feature meets the condition, detecting a predetermined pattern for specifying a search region on the image from the labeled map image, specifying the search region based on a position on the map image in which the predetermined pattern is detected, and retrieving the specific pattern in the specified search region on the image.
In a third aspect of the invention, a computer readable recording medium storing a program for controlling the computer to retrieve a specific pattern from an image is provided. The program is capable of executing the following functions: a function for dividing the image into a plurality of block regions, each of the block regions having a predetermined size; a function for creating a map image comprising a plurality of pixels, each of the pixels corresponding to one of the block regions by one to one; a function for extracting features of the image for each of the block regions; a function for deciding whether or not the extracted feature meets a predetermined condition; a function for labeling the pixel on the map image based on a result of the decision, the pixel corresponding to the block region of which feature meets the condition; a function for detecting a predetermined pattern for specifying a search region on the image from the labeled map image; a function for specifying the search region based on a position on the map image in which the predetermined pattern is detected; and a function for retrieving the specific pattern in the specified search region on the image.
In a fourth aspect of the invention, an image processing apparatus is provided for labeling a connecting region of an image. The apparatus comprises a flag memory including flags indicating whether or not each pixel of the image is finished with examination, a candidate stack for storing indexes, each of the indexes specifying a pixel of which adjacent pixels are to be examined, a region memory for storing indexes, each of the indexes indicating one of the pixels constituting the connecting region, a labeling unit for labeling pixels specified by the indexes stored in the region memory, and a connecting region detecting unit for detecting the pixels constituting the connecting region. The connecting region detecting unit scans the image, examines whether or not a pixel of the image constitutes the connecting region, and sets the flag for the examined pixel. When the pixel constituting the connecting region is first detected in the examination, the connecting region detecting unit (1) pushes an index of the detected pixel into the candidate stack, and stores the index to the region memory, and (2) while there is at least one index stored in the candidate stack, repeats a) popping an index from the candidate stack to examine whether or not pixels adjacent to a pixel which is indicated by the popped index and of which flag indicates unexamined constitute the connecting region, b) when the adjacent pixels constitute the connecting region, pushing the indexes of the adjacent pixels into the candidate stack and storing the indexes to the region memory, and c) setting the flags for the adjacent pixels.
In the apparatus, the labeling unit may calculate a size of a region including pixels indicated by indexes stored in the memory when labeling pixels, and prohibit labeling the pixel when the calculated size is out of a predetermined range. In the apparatus, the region memory may comprise a stack memory.
In a fifth aspect of the invention, an image processing method is provided for labeling a connecting region of an image. The method comprises scanning the image, examining whether or not a pixel of the image constitutes the connecting region, setting the flag for the examined pixel, and when the pixel constituting the connecting region is first detected in the examination, (1) pushing an index of the detected pixel into a stack, and storing the index to a memory, (2) while there is at least one index stored in the stack, repeating a) popping an index from the stack to examine whether or not pixels adjacent to a pixel which is indicated by the popped index and of which flag indicates unexamined constitute the connecting region, b) when the adjacent pixels constitute the connecting region, pushing the indexes of the adjacent pixels into the stack and storing the indexes to the memory, and c) setting the flags for the adjacent pixels, (3) reading all indexes out from the memory when there is no index stored in the stack to label pixels indicated by the read indexes with the same number.
In a sixth aspect of the invention, a computer readable recording medium in which a program for controlling the computer to label a connecting region of an image is provided. The program is capable of executing the following functions: a function for scanning the image; a function for examining whether or not a pixel of the image constitutes the connecting region; a function for setting the flag for the examined pixel; and when the pixel constituting the connecting region is first detected in the examination, (1) a function for pushing an index of the detected pixel into a stack, and storing the index to a memory, (2) while there is at least one index stored in the stack, repeating a) a function for popping an index from the stack to examine whether or not pixels adjacent to a pixel which is indicated by the popped index and of which flag indicates unexamined constitute the connecting region, b) when the adjacent pixels constitute the connecting region, a function for pushing the indexes of the adjacent pixels into the stack and storing the indexes to the memory, and c) a function for setting the flags for the adjacent pixels, (3) a function for reading all indexes out from the memory when there is no index stored in the stack to label pixels indicated by the read indexes with the same number.
This object as well as other objects, features and advantages of the invention will become more apparent to those skilled in the art from the following description with reference to the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a diagram schematically showing the structure of an image processing apparatus according to the present invention.
FIG. 2 is a block diagram of the image processing apparatus, mainly showing a controller.
FIG. 3 is a flowchart showing a main routine in the image processing apparatus according to a first embodiment.
FIG. 4 is a diagram showing an object image (an original image) from which a specific pattern is retrieved and a map image corresponding thereto.
FIG. 5 is a flowchart showing an image (pattern) recognition process.
FIG. 6 is a diagram illustrating the scanning by a map filter on a map image.
FIG. 7 is a diagram showing an example of the map filter.
FIG. 8 is a diagram showing an example of an image filter.
FIG. 9 is a flowchart showing the detecting process on an image.
FIG. 10 is a flowchart showing the detecting process on an original image.
FIG. 11 is a diagram illustrating a correspondence of a position A specified on the map image and a search region A′ (scanning range) on the original image.
FIG. 12 is a flowchart showing a main routine in the image processing apparatus according to a second embodiment.
FIG. 13 is a diagram illustrating the scanning of a connecting region.
FIG. 14 is a diagram illustrating a state of expansion of adjacent pixels to be examined.
FIG. 15 is a diagram showing an examination-end flag, a next examination candidate stack and a connecting region stack which are logically constituted on a RAM.
FIG. 16 is a flowchart showing a labeling process of the connecting region.
FIG. 17 is a flowchart showing a connecting region detecting process.
FIG. 18A is a diagram showing an image including the connecting region comprising pixels with values of “1”.
FIG. 18B is a diagram showing the index of each pixel of an image including the connecting region.
FIG. 19 is a diagram illustrating pixels adjacent to the pixel to be examined.
FIGS. 20A to 20H are diagrams illustrating a transition of the state of the examination-end flag, the next examination candidate stack and the connecting region stack.
FIG. 21 is a diagram illustrating a labeling process in a connecting region according to the prior art.
DETAILED DESCRIPTION OF THE INVENTION
An image processing apparatus according to preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.
First Embodiment
An image processing apparatus according to the present embodiment serves to retrieve a specific pattern in an image.
Structure of Image Processing Apparatus
FIG. 1 is a diagram showing the schematic structure of an image processing apparatus according to the present embodiment (which will be hereinafter referred to as a “system”). As shown in FIG. 1, the system comprises a central processing unit (which will be hereinafter referred to as a “CPU”), and is mainly constituted by a controller 1 for controlling the whole system. For example, a Pentium manufactured by Intel Corporation or the like is used for the CPU. To the controller 1 are connected a display 2 for displaying images, characters, operation screens and the like, a keyboard 3 and a mouse 4 for carrying out various inputs, indicating operations and the like, a floppy disk drive 5 a and a hard disk drive 6 which are data saving media, a printer 7 for printing characters, image data and the like, a scanner 8 for fetching image data, a CD-ROM drive 9 b for reading data stored in a CD-ROM 9 a, a speaker 10 for outputting a voice, and a microphone 11 for inputting a voice.
FIG. 2 is a block diagram showing the present system. A ROM 203 in which a program for controlling the system is stored and a RAM 204 for temporarily storing a program and data to be executed for control by the CPU 201 are connected to the CPU 201 through a data bus 220. Moreover, a circuit to be connected to the CPU 201 through the data bus 220 includes a display control circuit 205 for controlling the display 2 for displaying images, characters or the like, a keyboard control circuit 206 for transferring and controlling an input from the keyboard 3, a mouse control circuit 207 for transferring and controlling an input from the mouse 4, a floppy disk drive control circuit 208 for controlling the floppy disk drive 5 a, a hard disk drive control circuit 209 for controlling the hard disk drive 6, a printer control circuit 210 for controlling an output to the printer 7, a scanner control circuit 211 for controlling the scanner 8, a CD-ROM drive control circuit 212 for controlling the CD-ROM drive 9 b, a speaker control circuit 213 for controlling the speaker 10, and a microphone control circuit 214 for controlling the microphone 11. Furthermore, a clock 202 for generating a reference clock necessary for operating the system is connected to the CPU 201, and an expansion slot 215 for connecting various extension boards is connected to the CPU 201 through the data bus 220. A SCSI board is connected to the expansion slot 215, and the floppy disk drive 5 a, the hard disk drive 6, the scanner 8, the CD-ROM drive 9 b or the like may be connected through the SCSI board.
While the floppy disk drive 5 a and the hard disk drive 6 have been used as the data storage media in the above-mentioned system, other information recording media such as a magneto-optic disk (MO) and the like may be used. Moreover, while the scanner 8 is used as the image data input device, other data input devices such as a still video camera, a digital camera and the like may be used. Furthermore, while the printer 7 has been used as an output device, other output devices such as a digital printer may be used. In the present system, moreover, a program for implementing a data management system is stored in the ROM 203. However, a part or all of the programs may be stored in information recording media such as the floppy disk drive 5 a, the hard disk drive 6, the CD-ROM drive 9 b and the like, and a program and data may be read out from the information recording media to the RAM 204 and executed if necessary.
Main Routine
FIG. 3 is a flowchart showing a main routine of a program to be executed by the CPU 201 in the system. The system (CPU 201) sequentially executes each process according to the user's operations.
When the program is started, first of all, an initializing process is carried out for initializing flags and the like necessary for each of the subsequent processes and for displaying an initial menu screen and the like (S1). Icons for selecting a process, such as “IMAGE RECOGNITION”, “OTHERS” and the like, are displayed on the initial menu screen. When the user selects one of the icons, a process corresponding to the icon is executed. Next, it is decided whether or not any process is selected by the user on the initial menu screen (S2). If “IMAGE RECOGNITION” is selected at step S2, the routine proceeds to an image recognizing process (S3) for confirming the presence of a pattern having a specific layout (referred to as a “layout pattern”) in an input image. Then, the routine returns to step S2. When “OTHERS” is selected at step S2, another process (S4), which is a predetermined process other than the image recognition, is carried out. Then, the routine returns to step S2. When “END” is selected at step S2, a predetermined post-process (S5) for terminating the operation of the system is carried out to end the control. If nothing is selected at step S2, the system waits for the user to select one of the processes.
Since another process (step S4) and the post-process (step S5) are basically the same as well-known processes used in general information processing systems, their description will be omitted. The image recognition process (step S3) will be described later in detail.
Image Recognition Process
The summary of the image recognition process of the system will be described below with reference to FIG. 4. In the image recognition process, when a specific pattern is retrieved in the image 20 to be searched, a map image 22 is used which indicates regions in the image 20 whose features coincide exactly or approximately with a part of or the whole specific pattern. The map image 22 is created such that one pixel corresponds to one block obtained by dividing the whole region of the image 20 into block regions having a predetermined size (8×8 pixels). Accordingly, the size of the map image 22 is an eighth of the original image 20 in each of the x and y directions in FIG. 4. The system uses the map image 22 to retrieve the specific pattern in the following manner. The system extracts a feature of the image 20 for each block region, examines the feature for each block region, and records the result on the pixel of the map image 22 corresponding to the block region of the image 20 (labeling). Then, the system presumes a region where the specific pattern on the image 20 is present based on the pixel pattern formed by the labeling on the map image 22, and examines the presumed region on the image 20 in detail, thereby retrieving the specific pattern. Thus, specifying the retrieval region by using the small map image 22 reduces the processing time required for specifying the retrieval region, so that the specific pattern retrieval can be carried out at a high speed. This process will be described below with reference to the flowchart shown in FIG. 5.
As shown in FIG. 5, when the present process is started, image data of the image 20 is input (S301). The image 20 is the object in which the specific pattern is to be retrieved. More specifically, the image data of the image 20 designated by the user is loaded into the RAM 204. The input image 20 is divided into block regions each having a predetermined size. The block size can be set to an arbitrary size (m×n pixels). Next, the map image 22 is created (S302). More specifically, a working region on the RAM 204 for the map image 22 is reserved and all the pixel values of the map image 22 are initialized to a constant value. When the size of the original image is Sx pixels×Sy pixels, the size of the map image is (Sx/m) pixels×(Sy/n) pixels. In the following, a block size of 8 pixels×8 pixels is assumed.
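The size relation between the original image and the map image can be sketched as follows. This is an illustrative Python sketch, not the embodiment's implementation; the 640×480 image size, the function name and the list-of-lists representation are assumptions.

```python
# Minimal sketch of map-image creation (step S302), assuming the original
# image size is sx x sy pixels and the block size is m x n. The map image
# has one pixel per block region, initialized to a constant value (0).
def create_map_image(sx, sy, m=8, n=8):
    return [[0] * (sx // m) for _ in range(sy // n)]

map_image = create_map_image(640, 480)
# With 8 x 8 blocks, a 640 x 480 image yields an 80 x 60 map image:
# one eighth of the original in each of the x and y directions.
```
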
Next, a pointer i indicative of a block region to be processed is set to 1 (S303). The feature of the image is calculated for the ith block region indicated by the pointer i (S304). The feature of the ith block region is obtained in the following manner. The feature is obtained for each pixel in the block region, and the pixel is labeled when the feature amount is within a predetermined range. Then, the number of the labeled pixels in the block region is counted as the feature amount of the block. The feature includes a) the number of pixels within a specific range in a color space, b) the strength of an edge, c) the co-occurrence characteristic obtained by conversion to a binary image, and the like. These features will be briefly described.
a) Number of Pixels Within Specific Range in Color Space
When a specific pattern is constituted by a single color and has a uniform density within some error (difference) range for a block size, a pixel in a block region which has the same color as the specific pattern is labeled. In the block region, the labeled pixels are counted and the number thereof is set as the feature amount of the block region. At this time, a block region having the feature amount within a predetermined error range can be a candidate region in which the specific pattern is to be searched.
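Feature a) can be sketched as below. The single-channel (grayscale) block and the range bounds are hypothetical choices made for illustration; the patent leaves the color space and thresholds open.

```python
# Hedged sketch of feature a): count the pixels of a block region whose
# values fall within a specific range (here a single-channel value range;
# lo and hi are hypothetical parameters).
def color_range_count(block, lo, hi):
    return sum(1 for row in block for p in row if lo <= p <= hi)

block = [[10, 200, 210],
         [205, 30, 198],
         [202, 201, 50]]
feature_amount = color_range_count(block, 190, 220)   # 6 pixels in range
```

A block whose count lands inside the predetermined error range would then become a candidate region, as the text describes.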
b) Strength of Edge
In the case where the specific pattern has an edge which is uniform within a certain error range for the block size, the image in the block region is scanned with an edge extracting filter, and the sum of all the filter responses is calculated to obtain the feature amount (edge amount) of the block region. A block having the feature amount within a predetermined error range can be a candidate region in which the specific pattern is searched. The edge extracting filter is, for example, a Laplacian filter.
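A sketch of feature b) follows, assuming a 3×3 Laplacian kernel and absolute-value summation; both are illustrative choices, since the patent only names a Laplacian filter as one example of an edge extracting filter.

```python
# Hedged sketch of feature b): scan the block with a 3x3 Laplacian filter
# and sum the absolute responses as the edge amount. The particular kernel
# and the use of absolute values are assumptions for illustration.
LAPLACIAN = [[0, 1, 0],
             [1, -4, 1],
             [0, 1, 0]]

def edge_amount(block):
    h, w = len(block), len(block[0])
    total = 0
    for y in range(1, h - 1):                  # skip border pixels
        for x in range(1, w - 1):
            response = sum(LAPLACIAN[j][i] * block[y + j - 1][x + i - 1]
                           for j in range(3) for i in range(3))
            total += abs(response)
    return total

flat = [[5] * 4 for _ in range(4)]             # uniform block: no edges
step = [[0, 0, 9, 9]] * 4                      # vertical edge: nonzero amount
```
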
c) Co-occurrence Characteristic Obtained By Conversion to Binary Image
When the specific pattern has a uniform co-occurrence characteristic within a certain error range for the block size, the image in the block region is converted into a binary image with a certain threshold, and then the co-occurrence characteristic of each pixel with its adjacent pixel is set as the feature amount. The co-occurrence characteristic represents a degree of gathering of the pixels having the same pixel value and indicates a probability that a certain pixel has the same pixel value as the value of a pixel adjacent thereto. A high co-occurrence characteristic implies that the pixels having the same value are densely present, while a low co-occurrence characteristic implies that the pixels having the same value are sparsely present.
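Feature c) can be sketched as below; counting only the right and lower neighbour of each pixel is a simplification made for brevity, not the patent's definition of adjacency.

```python
# Rough sketch of feature c): after binarization, measure how often a pixel
# shares its value with its right and lower neighbours. The returned ratio
# is high when same-valued pixels cluster densely, low when they alternate.
def cooccurrence(binary):
    h, w = len(binary), len(binary[0])
    same = pairs = 0
    for y in range(h):
        for x in range(w):
            for dy, dx in ((0, 1), (1, 0)):    # right and lower neighbour
                ny, nx = y + dy, x + dx
                if ny < h and nx < w:
                    pairs += 1
                    same += binary[y][x] == binary[ny][nx]
    return same / pairs

dense = [[1, 1], [1, 1]]       # all identical: maximal co-occurrence
checker = [[0, 1], [1, 0]]     # alternating: minimal co-occurrence
```
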
Returning to FIG. 5, the feature amount of the ith block region is calculated (S304). Then, it is decided whether or not the feature amount is within a predetermined range, that is, whether the feature amount is acceptable (S305). The predetermined range is set such that the image in the block region can be determined to be equivalent to a part of or the whole pattern based on the calculated feature amount. When the feature amount is within the predetermined range, the pixel on the map image 22 corresponding to the ith block region is labeled (S306). When the feature amount is out of the predetermined range, the step S306 is skipped. Then, the pointer i is incremented (S307). It is decided whether or not the pointer i exceeds the number of all the block regions (S308). The routine returns to step S304 when the pointer i does not exceed the number of all the block regions, and the above-mentioned processes (S304 to S308) are repeated until the extraction and decision of the feature amount and the like are completed for all the block regions on the image 20. When these processes are completed for all the block regions on the image 20, a process for retrieving the specific pattern is carried out by using the map image 22 (S309), and the routine then returns.
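The loop of steps S303 to S308 can be condensed into the following sketch; the feature function is passed in by the caller, and all names and the sample data are illustrative, not taken from the embodiment.

```python
# Minimal sketch of steps S303-S308: walk every block region, compute its
# feature amount with a caller-supplied function, and label the matching
# map pixel when the amount falls within the accepted range [lo, hi].
def label_map(image, m, n, feature_fn, lo, hi):
    sy, sx = len(image), len(image[0])
    map_img = [[0] * (sx // m) for _ in range(sy // n)]
    for by in range(sy // n):
        for bx in range(sx // m):
            block = [row[bx * m:(bx + 1) * m]
                     for row in image[by * n:(by + 1) * n]]
            if lo <= feature_fn(block) <= hi:
                map_img[by][bx] = 1        # step S306: label this map pixel
    return map_img

img = [[1, 1, 0, 0],
       [1, 1, 0, 0],
       [0, 0, 0, 0],
       [0, 0, 0, 0]]
# Hypothetical feature: sum of pixel values in the 2x2 block.
mp = label_map(img, 2, 2, lambda b: sum(map(sum, b)), 3, 4)
```
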
Next, the retrieving process of the image (step S309) will be described. In the process, a region on the image 20 where a specific pattern might be present is presumed by using the map image 22 labeled based on the feature amount of the image 20, and a specific pattern is retrieved for the presumed region on the image 20.
The region on the map image 22 is presumed in the following manner. The map image 22 labeled based on the feature amount of the image 20 is scanned by using a filter (hereinafter referred to as a “map filter”) for recognizing the predetermined pattern (hereinafter referred to as a “region specifying pattern”) for specifying a region, and the region specifying pattern is retrieved from the map image 22. More specifically, the map image 22 is scanned by using a map filter 22 f as shown in FIG. 6. When the region specifying pattern is found on the map image 22, a specific pattern is retrieved by using a filter (hereinafter referred to as an “image filter”) for recognizing a specific pattern in a region on the original image 20 corresponding to a position on the map image 22 where the region specifying pattern is detected.
FIG. 7 shows an example of the map filter 22 f and FIG. 8 shows an example of the image filter 20 f. Each of these filters 22 f and 20 f has a central portion (a region hatched with a dot in the drawing) and a peripheral portion of the central portion (a region hatched with a slant line in the drawing). It is preferable that each filter 20 f or 22 f has a size equal to the size of the specific pattern to be retrieved. In the following description, the position of the upper left end of each of the filters 20 f and 22 f is regarded as the position of the filter. In the present embodiment, moreover, a circular pattern is used as an example of the specific pattern to be recognized and detected.
The process of retrieving an image (step S309) will be described with reference to a flowchart shown in FIG. 9. In the present process, the position of the scanning start of the map filter 22 f is first determined. More specifically, the coordinates in the y direction of the map filter 22 f are set to be positioned on the upper end of the map image 22 (S401), and coordinates in the x direction of the map filter 22 f are set to be positioned on the left end of the map image 22 (S402).
When the coordinates of the map filter 22 f are set, a pattern is detected by using the map filter 22 f in that position (S403). Concretely, the numbers of labeled pixels are counted in the central portion of the map filter 22 f and in the peripheral portion of the central portion (which will be hereinafter referred to as a “peripheral portion”) shown in FIG. 7, respectively. A region specifying pattern is determined to be detected when the number (M) counted in the central portion of the map filter 22 f is within a predetermined range for the central portion and the number (N) counted in the peripheral portion is within a predetermined range for the peripheral portion. For example, when the number (M) counted in the central portion of the map filter 22 f is 8 or more and there is no count in the peripheral portion, that is, M&gt;7 and N=0, the region specifying pattern may be decided to be detected or found. The predetermined ranges for the respective count numbers M and N are properly set according to the retrieval precision of the map filter 22 f. When the region specifying pattern is detected, its detection position is set to the coordinates of the upper left end of the map filter 22 f.
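The decision at step S403 can be sketched as follows, using the example criterion M > 7 and N = 0 from the text. The 5×5 filter with a 3×3 central portion used below is a hypothetical shape; a real filter would mirror FIG. 7.

```python
# Hedged sketch of step S403: count labeled map pixels under the central
# and peripheral portions of the map filter placed at (fx, fy), then apply
# the example criterion M > 7 and N = 0.
def detect_at(map_img, fx, fy, central, peripheral):
    count = lambda pts: sum(map_img[fy + dy][fx + dx] for dx, dy in pts)
    m, n = count(central), count(peripheral)
    return m > 7 and n == 0

# Hypothetical 5x5 filter: 3x3 central portion, the rest is peripheral.
central = [(dx, dy) for dx in range(1, 4) for dy in range(1, 4)]
peripheral = [(dx, dy) for dx in range(5) for dy in range(5)
              if (dx, dy) not in central]

mp = [[0] * 5 for _ in range(5)]
for yy in range(1, 4):
    for xx in range(1, 4):
        mp[yy][xx] = 1                 # 9 labeled pixels under the center
found = detect_at(mp, 0, 0, central, peripheral)   # M = 9, N = 0
```
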
As a result of the detection in the above-mentioned manner, when the region specifying pattern is detected, a specific pattern is detected in a region on the original image 20 which corresponds to the detection position of the region specifying pattern (S405). This process will be described below in detail. When no region specifying pattern is detected, the step S405 is skipped.
Then, the map filter 22 f is moved by a predetermined amount (for example, one pixel) in the x direction (S406). It is decided whether or not the right end of the map filter 22 f gets over the right end of the map image 22, that is, the map filter 22 f gets out of the right end of the map image 22 (S407). When the map filter 22 f does not get out of the map image 22, the routine returns to S403 where a region specifying pattern is detected in a new position. When the map filter 22 f gets out of the map image 22, the map filter 22 f is moved by a predetermined amount (for example, one line) in the y direction (S408). It is decided whether or not the lower end of the map filter 22 f gets over the lower end of the map image 22, that is, the map filter 22 f gets out of the map image 22 (S409). When the map filter 22 f does not get out of the map image 22, the routine returns to the step S402 where the region specifying pattern is detected in a new position. Thus, the above-mentioned processes (S402 to S409) are repeated until the whole map image 22 is scanned by the map filter 22 f.
The retrieving process on the original image (step S405) will be described with reference to a flowchart shown in FIG. 10. A position on the original image 20 corresponding to the position on the map image 22 where the region specifying pattern is detected is obtained, and a specific pattern is retrieved by using the image filter 20 f in the retrieving region set based on this position.
First of all, a block size of 8 is assigned to a variable m for storing a block size (S501). The block size may be 2 or more. A position (X, Y) on the original image 20 corresponding to a position (x, y) of the region specifying pattern detected on the map image 22 is calculated (S502, S503). The corresponding position (X, Y) on the image 20 can be obtained by multiplying, by m, each position (x, y) on the map image 22 where the region specifying pattern is detected. As shown in FIG. 11, a block A′(X, Y) on the image 20 corresponding to a pixel A of the coordinate (x, y) on the map image 22 as shown in FIG. 11 indicates the following rectangular region:
X=m×x to m×(x+1)−1
Y=m×y to m×(y+1)−1.
By scanning the image 20 using the image filter 20 f within this range, a specific pattern is detected.
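The coordinate mapping of steps S502 and S503 can be written out as below, following the X and Y ranges given above with block size m = 8; the function name and the sample map pixel are illustrative.

```python
# Sketch of steps S502-S503: the rectangular block of the original image
# corresponding to map pixel (x, y), with block size m (default 8).
def block_range(x, y, m=8):
    return (m * x, m * (x + 1) - 1), (m * y, m * (y + 1) - 1)

(x0, x1), (y0, y1) = block_range(3, 2)
# map pixel (3, 2) corresponds to X = 24..31 and Y = 16..23 on the image
```
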
When the position (X, Y) of the image filter 20 f is determined, a specific pattern is detected by using the image filter 20 f in that position (S504). The specific pattern is detected by using the image filter 20 f in the following manner, for example. The feature amount of each pixel is extracted in the central portion of the image filter 20 f shown in FIG. 8 (a region hatched with a dot) and a peripheral portion thereof (a region hatched with a slant line). When the extracted feature amount is within a predetermined range, the pixel is labeled. Next, the labeled pixels are counted in the central and peripheral portions of the image filter 20 f, respectively. When each of the results obtained by counting is within a predetermined range for each of the central and peripheral portions, a specific pattern is decided to be detected. In this case, a specific pattern is present in the central portion of the image filter 20 f (in a circle shown in a broken line of the drawing).
Next, it is decided whether or not the specific pattern is detected (S505). When the specific pattern is detected, a predetermined process to be performed is carried out (S506). The predetermined process includes, for example, a process of recording the detected coordinate value of the pattern on a storage device such as a hard disk, a process of displaying, on a display unit, a message saying that the specific pattern has been detected, and the like. When the specific pattern is not detected, the step S506 is skipped. Then, the X coordinate of the image filter 20 f is incremented (S507) and it is decided whether or not the right end of the image filter 20 f gets over a scanning range (S508). When the image filter 20 f does not get over the right end, the routine returns to the step S504 where a specific pattern is detected in a new position shifted in the X direction in the same manner as described above. When the right end of the image filter 20 f gets over the scanning range, the Y coordinate of the image filter 20 f is incremented (S509) and it is decided whether or not the lower end of the image filter 20 f gets over the scanning range (S510). When the image filter 20 f does not get over the lower end, the routine returns to the step S503 where a specific pattern is detected in a position on the left end of the scanning range which is lower by one line than the position at the last time. The above-mentioned processes (S503 to S510) are repeated until the scanning operation by the image filter 20 f over the predetermined scanning range is completed.
As described above, the image processing apparatus according to the present embodiment creates the map image 22 having pixels each corresponding to a block region of the original image 20 by one to one, and each pixel of the map image 22 is labeled based on the feature amount of the image 20, when the specific pattern is retrieved from the image 20. Then, the region where the specific pattern is present is specified based on the pattern of the map image 22 formed by the labeling, and the process of retrieving the specific pattern is carried out in the region on the original image 20 corresponding to the specified region on the map image 22. More specifically, when the block size of the original image 20 is m×n, the size of the map image 22 is 1/(m×n) of that of the image 20 and is thus smaller than the image 20. Therefore, the position of the specific pattern is specified by using the map image 22 having a smaller size so that the working region required for the process is reduced. Furthermore, the required processing time can also be shortened. As a result, the specific pattern on the image can be retrieved at a high speed. Moreover, since the map image 22 is labeled based on the feature amount of the original image 20, the characteristics of the original image 20 are kept. Consequently, it is possible to specify the detection region with high precision.
Second Embodiment
An image processing apparatus according to the present embodiment serves to detect and label, as a connecting region in a binary image, a continuous region composed of pixels which have the same pixel value.
Main Routine of Image Processing Apparatus
FIG. 12 is a flowchart showing a main routine of a program to be executed by a CPU 201 in a system according to the present embodiment. When “LABELING OF CONNECTION REGION” is selected at a step S12, the labeling process is performed in which a connecting region in an input image is labeled (S13). The labeling process of the connecting region (step S13) will be described below.
Labeling Process of Connecting Region
The image processing apparatus detects and labels, as a connecting region, a continuous region including pixels each having a value of “1” in a binary image constituted by pixels having a value of “0” or “1”. The image processing apparatus detects a connecting region in the image by scanning the image. The scanning operation is carried out as shown in FIG. 4. First of all, the scanning starts from the upper left end of the image 20 and proceeds in the right direction (x direction). When the scanning reaches the right end of the image 20, it returns to the left end of the image, shifts downward (in the y direction) by one line, and proceeds in the x direction again from that position.
When there is a connecting region D in the image 20 as shown in FIG. 13, the image processing apparatus scans the image 20. When one pixel (a) (having a pixel value of “1”) constituting the connecting region D is first detected, it is examined whether or not a pixel adjacent to the pixel (a) constitutes the connecting region D together with the pixel (a). When the adjacent pixel constitutes the connecting region D, it is examined whether or not a pixel adjacent to the adjacent pixel similarly constitutes the connecting region D. When the adjacent-adjacent pixel constitutes the connecting region D, a pixel adjacent to the adjacent-adjacent pixel is further examined in the same manner. As long as a pixel constituting the connecting region D is continuously detected in the same manner, the same examination is carried out for a pixel adjacent to the detected pixel.
As shown in FIG. 14, when the pixel (a) is first detected as a pixel constituting the connecting region D, it is examined whether or not the value of a pixel adjacent to the pixel (a) is “1”. When a pixel (b) having a value of “1” adjacent to the pixel (a) is found, a pixel adjacent to the pixel (b) is further examined. When a pixel (c) having a value of “1” adjacent to the pixel (b) is found, a pixel adjacent to the pixel (c) is further examined. Subsequently, as long as the pixel having a value of “1” is found, the examination is continued in the same manner. Consequently, it is possible to finally detect one connecting region D, that is, all the pixels constituting the connecting region D. Then, all the detected pixels are labeled with the same number.
To implement the above process, the image processing apparatus has an examination state flag, a next examination candidate stack and a connecting region stack. These are logically constituted on a RAM 204 as shown in FIG. 15, for example. The examination state flag 15 includes an examination-end flag which corresponds to each pixel of the image 20 and indicates whether or not each pixel has been examined. For example, the examination state flag comprises one-dimensional array data, and one array element specified by the index of a pixel corresponds to one examination-end flag. The index is information such as a number for specifying each pixel in an image. The examination-end flag of “1” (ON) indicates that the pixel has been examined, and the examination-end flag of “0” (OFF) indicates that the pixel has not been examined yet. A next examination candidate stack 17 serves to store the index of a pixel whose adjacent pixels are to be examined. A connecting region stack 19 serves to store the indexes of pixels constituting the connecting region. Both stacks 17 and 19 are memories or storage means having a Last In First Out (LIFO) function. By pushing, data are stored in the stacks 17 and 19, and by popping, the stored data are fetched from the stacks 17 and 19. Both stacks 17 and 19 may store other information for specifying a pixel in place of the index of a pixel. The stacks 17 and 19 may also be implemented by storage means without the LIFO function. In this case, it is necessary to separately manage information about which data have been read out from the storage means.
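Putting the examination-end flags and the two stacks together, the detection and labeling procedure can be sketched as follows. This is a condensed Python sketch: 4-connectivity and (x, y) pairs in place of scalar indexes are simplifying assumptions (the patent computes a scalar index, and FIG. 19 defines the actual neighbourhood to be examined), and the sample image is hypothetical, not the one in FIG. 18A.

```python
# Condensed sketch of the connecting-region labeling algorithm: scan a
# binary image and, on first meeting an unexamined pixel of value 1, grow
# the region with a next-examination-candidate stack while recording the
# member pixels in a connecting-region stack, then label them all at once.
def label_regions(img):
    h, w = len(img), len(img[0])
    examined = [[False] * w for _ in range(h)]   # examination-end flags
    labels = [[0] * w for _ in range(h)]
    label = 1                                    # variable LABEL
    for y in range(h):
        for x in range(w):
            if examined[y][x] or img[y][x] != 1:
                examined[y][x] = True
                continue
            examined[y][x] = True
            candidates = [(x, y)]                # next examination candidate stack
            region = [(x, y)]                    # connecting region stack
            while candidates:
                cx, cy = candidates.pop()        # pop an index
                for nx, ny in ((cx + 1, cy), (cx - 1, cy),
                               (cx, cy + 1), (cx, cy - 1)):
                    if 0 <= nx < w and 0 <= ny < h and not examined[ny][nx]:
                        examined[ny][nx] = True  # set the flag
                        if img[ny][nx] == 1:
                            candidates.append((nx, ny))
                            region.append((nx, ny))
            for rx, ry in region:                # label the whole region
                labels[ry][rx] = label
            label += 1
    return labels

img = [[1, 1, 0, 0, 0],
       [0, 1, 0, 0, 1],
       [0, 0, 0, 0, 1],
       [0, 0, 0, 0, 1],
       [0, 0, 0, 0, 0]]
result = label_regions(img)   # two connecting regions, labeled 1 and 2
```

Because each pixel is flagged exactly once, the work is linear in the number of pixels, which is the advantage over re-scanning approaches.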
The above-mentioned labeling process of the connecting region (step S13) will be described with reference to flowcharts shown in FIGS. 16 and 17.
In the following description, an image 21 shown in FIG. 18A is used as an image to be subjected to the labeling process. The image 21 has a size of 5 pixels×5 pixels, and has two connecting regions E and F. In FIG. 18A, a value indicated in each pixel of the image 21 means a pixel value. In FIG. 18B, a numeral in parentheses which is allocated to each pixel of the image 21 means the index of the pixel. The index is calculated by the following equation based on a coordinate value (x, y):
 pixel index=x+y×(image size in the x direction).
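As an illustrative check of this formula (a minimal Python sketch; the function names are assumptions, not from the patent), for the 5-pixel-wide image 21 the pixel at coordinates (1, 1) has the index 6, which is the first pixel of the connecting region E detected later:

```python
SIZEX = 5  # x-direction size of the image 21

def pixel_index(x, y):
    """Index of the pixel at (x, y): x + y * (image size in the x direction)."""
    return x + y * SIZEX

def pixel_coords(index):
    """Inverse mapping: recover the coordinates (x, y) from a pixel index."""
    return index % SIZEX, index // SIZEX

print(pixel_index(1, 1))   # → 6
print(pixel_coords(6))     # → (1, 1)
```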
The flowchart shown in FIG. 16 will be described. When the present process is started, the image data of the binary image 21 are first input (S1301). More specifically, the image data of the binary image 21 are loaded into the RAM 204. Next, the examination state flag 15 is created (S1302). That is, the working region of the examination state flag 15 is secured in a predetermined region on the RAM 204 and the examination-end flags constituting the examination state flag 15 are initialized to “0”. In the case in which the value of each pixel is indicated by a plurality of bits, the examination state flag 15 can also be implemented by assigning one of the plurality of bits to the examination-end flag.
Next, the next examination candidate stack 17 and the connecting region stack 19 are created (S1303, S1304). More specifically, the working regions of the next examination candidate stack 17 and the connecting region stack 19 are held in predetermined regions on the RAM 204. Next, variables SIZEX, SIZEY and LABEL are initialized (S1305). Concretely, the variable SIZEX is set to the x-direction size value of the image 21 (for example, 5), the variable SIZEY is set to the y-direction size value of the image 21 (for example, 5), and the variable LABEL is set to 1. The variable LABEL gives a label number for labeling the connecting region. Next, a variable y for giving the y coordinate of an examination position is initialized to 0 (S1306), and furthermore, a variable x for giving the x coordinate of the examination position is initialized to 0 (S1307).
It is decided whether or not a pixel in a position (x, y) has not been examined yet and is to be labeled (that is, its pixel value is “1”) (S1308). Whether or not the pixel has been examined can be decided by referring to the examination state flag 15. Consequently, repetitive examinations can be avoided and the time required for the processing can be shortened. The object of this step is to detect the first of the pixels constituting each independent connecting region in the image. A pixel detected at this step is referred to as “the first pixel of the connecting region”.
When the pixel has not been examined yet and is to be labeled, that is, the first pixel of the connecting region is detected, a connecting region detecting process for detecting and labeling a connecting region including the pixel (S1309) is carried out, and the routine then proceeds to a step S1311. The connecting region detecting process will be described below in detail. On the other hand, when the pixel has already been examined or is not to be labeled, the examination-end flag is set to ON for the examined pixel (S1310) and the routine then proceeds to the step S1311.
At the step S1311, the variable x is incremented to shift the examination position in the x direction, and then the variable x is compared with the variable SIZEX to decide whether or not the examination position is outside the range of the image 21 (S1312). When the examination position is within the range of the image 21 as a result of the decision, the routine returns to the step S1308 where the connecting region is examined in a new position shifted in the x direction in the same manner as described above. On the other hand, when the next examination position is outside the range of the image 21 as a result of the decision, the variable y is incremented to shift the examination position in the y direction (S1313), and the variable y is then compared with the variable SIZEY to decide whether or not the examination position is outside the range of the image 21 (S1314). When the examination position is within the range of the image 21, the routine returns to the step S1307 where the connecting region is examined in a new position on the left end of the image 21 which is lower by one line than the last position, in the same manner as described above. When the next examination position is outside the range of the image 21, the routine returns. The above-mentioned processes are repeated until the whole image 21 is examined and all the connecting regions included in the image are detected and labeled (S1307 to S1314).
Next, the above-mentioned connecting region detecting process (step S1309) is described with reference to the flowchart shown in FIG. 17. In the process, it is examined whether or not a pixel on the periphery of the first pixel of the connecting region constitutes the same connecting region, thereby detecting and labeling the connecting region.
The index of the first pixel of the connecting region, that is, a pixel in a current coordinate position (x, y), is first pushed into the next examination candidate stack 17 (S1401), and is also pushed into the connecting region stack 19 (S1402). The examination-end flag of the first pixel of the connecting region is set ON (S1403). Then, the index is popped from the next examination candidate stack 17 (S1404). The coordinates of the pixels Pi (i=0 to 7) adjacent to the pixel indicated by the popped index are set respectively (S1405). The adjacent pixels Pi are set to P0, P1, P2, . . . P7 counterclockwise from the lower left pixel among the eight pixels surrounding a central pixel (the pixel (a)) as shown in FIG. 19. Alternatively, only the four pixels above, below, to the right of and to the left of the central pixel may be set as the adjacent pixels.
Next, a variable i for specifying the adjacent pixel Pi is initialized to 0 (S1406). It is decided whether or not the adjacent pixel Pi indicated by the variable i has not been examined and is to be labeled (S1407). When the pixel has not been examined and is to be labeled, the index of the pixel Pi is pushed into both stacks 17 and 19 (S1408). Then, the examination-end flag of the pixel Pi is set ON (S1409). When the pixel has already been examined or is not to be labeled, these steps (S1408, S1409) are skipped. Thereafter, the variable i is incremented (S1410), and the above-mentioned processes (S1406 to S1411) are repeated until it has been examined whether or not every adjacent pixel Pi is in the connecting region.
When the examination for all the adjacent pixels Pi is completed, the presence of an index stored in the next examination candidate stack 17 is confirmed (S1412). When there is an index stored in the stack 17, the routine returns to the step S1404. When there is no index stored in the next examination candidate stack 17, the routine proceeds to a step S1413. Thus, as long as an index is stored in the next examination candidate stack 17, the connecting region is examined for the pixels adjacent to the pixel indicated by the index popped from the next examination candidate stack 17 (S1404 to S1412). When there is no index stored in the next examination candidate stack 17, one connecting region is constituted by the pixels indicated by all the indexes stored in the connecting region stack 19.
When there is no index stored in the next examination candidate stack 17, the presence of the index stored in the connecting region stack 19 is confirmed (S1413). When there is no index stored in the connecting region stack 19, the routine returns. When there is at least one index stored in the connecting region stack 19, all the indexes are popped from the connecting region stack 19 (S1414), and the pixel indicated by the popped index is labeled with a label number indicated by the variable LABEL (S1415). Then, the variable LABEL is incremented (S1416) and the routine then returns.
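The two flowcharts above can be sketched in Python as follows (a minimal illustration under stated assumptions, not the patented implementation itself; the function name, the neighbor ordering and all variable names are illustrative). The list `exam_flag` plays the role of the examination state flag 15, and the lists `candidates` and `region` stand in for the next examination candidate stack 17 and the connecting region stack 19. The order in which the eight adjacent pixels are visited does not affect the result:

```python
def label_connecting_regions(image, size_x, size_y):
    """Label each 8-connected region of "1" pixels with a unique number."""
    exam_flag = [False] * (size_x * size_y)   # examination state flag 15
    labels = [0] * (size_x * size_y)          # 0 means "not labeled"
    label = 1                                 # the variable LABEL
    # Offsets of the eight adjacent pixels P0..P7 around a central pixel.
    neighbors = [(-1, -1), (0, -1), (1, -1), (1, 0),
                 (1, 1), (0, 1), (-1, 1), (-1, 0)]
    for y in range(size_y):                   # raster scan (S1306 to S1314)
        for x in range(size_x):
            idx = x + y * size_x              # pixel index
            if exam_flag[idx] or image[idx] != 1:
                exam_flag[idx] = True         # S1310
                continue
            # Connecting region detecting process (S1309, FIG. 17)
            candidates = [idx]                # next examination candidate stack 17
            region = [idx]                    # connecting region stack 19
            exam_flag[idx] = True             # S1401 to S1403
            while candidates:                 # S1404 to S1412
                p = candidates.pop()
                px, py = p % size_x, p // size_x
                for dx, dy in neighbors:      # S1405 to S1411
                    nx, ny = px + dx, py + dy
                    if 0 <= nx < size_x and 0 <= ny < size_y:
                        n = nx + ny * size_x
                        if not exam_flag[n] and image[n] == 1:
                            candidates.append(n)   # push (S1408)
                            region.append(n)
                            exam_flag[n] = True    # S1409
            for p in region:                  # S1414, S1415
                labels[p] = label
            label += 1                        # S1416
    return labels
```

Applied to a small 5×5 binary image containing two separate connecting regions, the sketch labels them 1 and 2 in the order in which their first pixels are reached by the raster scan.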
In the above-mentioned process, when the size of the connecting region is smaller than a predetermined value, the region may be treated as noise and the labeling process may be omitted. In that case, for example, a process for initializing a variable SUM to 0 is added in the step S1305 of the above flowchart, and furthermore a step for incrementing the variable SUM is added between the steps S1402 and S1403 and between the steps S1408 and S1409. Furthermore, the step S1413 is changed such that the variable SUM is compared with a predetermined value, and the routine proceeds to the step S1414 when the variable SUM is equal to or greater than the predetermined value and otherwise returns. The predetermined value is set to 8, for example, in the case where a small connecting region having fewer than eight pixels is to be ignored as noise. Furthermore, when the connecting region size exceeds a maximum size, the labeling process of the connecting region can be terminated. In this case, the step S1413 is changed such that the variable SUM is compared with a predetermined value (a maximum size value), and the routine returns when the variable SUM is equal to or greater than the predetermined value, or otherwise proceeds to the step S1414. While the size of the connecting region is calculated from the number of pixels constituting the connecting region as described above, the size of the connecting region may also be obtained from the sizes of the detected connecting region in the x and y directions.
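The two size checks described above, ignoring small regions as noise and terminating on regions above a maximum size, can be combined into a single gate at the step S1413 (an illustrative sketch; the function name and the maximum size value are assumptions, and the noise threshold of 8 pixels is the example given above):

```python
MIN_SIZE = 8       # connecting regions with fewer than 8 pixels are noise
MAX_SIZE = 10000   # assumed maximum size beyond which labeling is terminated

def should_label(region_size):
    """Label a connecting region only when its pixel count (the variable SUM
    of the modified flowchart) lies inside the permitted range."""
    return MIN_SIZE <= region_size < MAX_SIZE
```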
FIGS. 20A to 20H are diagrams showing a change in the states of the examination state flag 15, the next examination candidate stack 17 and the connecting region stack 19 which are obtained when the image 21 is subjected to the connecting region labeling process according to the process of the flowchart described above.
FIG. 20A shows a state before the examination is started. All the examination-end flags are initialized to “0” and no index is stored in either of the stacks 17 and 19. FIG. 20B shows a state in which the first pixel of the connecting region is detected. The pixel of an index of “6” is detected as the first pixel in the connecting region, and the index of “6” is pushed into the next examination candidate stack 17 and the connecting region stack 19. FIG. 20C shows a state in which the index of “6” is popped from the next examination candidate stack 17 and the examination of the pixel P0 adjacent to the first pixel in the connecting region specified by the index is completed. The examination-end flag of the pixel P0 is set to “1”. FIG. 20D shows a state in which the examination of the adjacent pixel P1 is completed. The pixel P1 has not been examined yet and is to be labeled. Therefore, the examination-end flag is set to “1” and the index of “11” of the pixel P1 is pushed into the next examination candidate stack 17 and the connecting region stack 19. FIG. 20E shows a state in which the examination of the adjacent pixel P2 is completed. The pixel P2 has not been examined and is to be labeled. Therefore, the examination-end flag is set to “1” and the index of “12” of the pixel P2 is pushed into the next examination candidate stack 17 and the connecting region stack 19.
Then, after completion of the examination of all the adjacent pixels P0 to P7 of the first pixel in the connecting region, the next examination candidate stack 17 is popped to fetch the index of “12”. Thereafter, the pixel adjacent to the pixel specified by the index of “12” is examined. FIG. 20F shows a state in which the examination is completed for the pixel P0 adjacent to the pixel indicated by the index of “12”. The examination-end flag of the pixel P0 is set to “1”. FIG. 20G shows a state in which the examination is completed for the pixels P0 to P7 adjacent to the pixel indicated by the index of “12”. As a result of the examination of the adjacent pixel, the indexes of “13” and “8” are pushed into the next examination candidate stack 17 and the connecting region stack 19.
Then, the next examination candidate stack 17 is sequentially popped until it becomes empty, and pixels adjacent to the pixels indicated by the indexes popped from the next examination candidate stack 17 are examined. FIG. 20H shows a state in which the examination is completed for the pixel adjacent to the pixel indicated by the index of “11” which is popped from the next examination candidate stack 17. The index stored in the connecting region stack 19 at this time indicates a pixel constituting a first connecting region E of the image 21.
As described above, the image processing apparatus according to the present embodiment detects each of the independent connecting regions separately. Therefore, the label number for a region is determined simultaneously with the detection of the connecting region. Consequently, the work of re-labeling label numbers required in the prior art is unnecessary, and the labeling of the connecting regions can be carried out in a single image scan. Since it is not necessary to scan the image twice as in the prior art, the labeling process can be executed at a high speed. Moreover, the connecting regions are detected one by one for the labeling. Therefore, in an environment capable of parallel processing, a predetermined process including an image recognition process can be carried out on one detected connecting region while the other connecting regions are being detected.
Numerous modifications and alternative embodiments of the invention will be apparent to those skilled in the art in view of the foregoing description. Accordingly, this description is to be construed as illustrative only, and is provided for the purpose of teaching those skilled in the art the best mode of carrying out the invention. The details of the structure and/or function may be varied substantially without departing from the spirit of the invention and all modifications which come within the scope of the appended claims are reserved.

Claims (11)

What is claimed is:
1. An image processing apparatus for labeling a connecting region of an image comprising:
a flag memory including flags indicating whether or not each pixel of the image is finished with examination;
a candidate stack for storing indexes, each of indexes specifying a pixel of which adjacent pixels are to be examined;
a region memory for storing indexes, each of indexes indicating one of pixels constituting the connecting region;
a labeling unit for labeling pixels specified by the indexes stored in the region memory; and
a connecting region detecting unit for detecting the pixels constituting the connecting region, the connecting region detecting unit scanning the image, examining whether or not a pixel of the image constitutes the connecting region, setting the flag for the examined pixel, and when the pixel constituting the connecting region is first detected in the examination,
(1) pushing an index of the detected pixel into the candidate stack, and storing the index to the region memory,
(2) while there is at least one index stored in the candidate stack, repeating a) popping an index from the candidate stack to examine whether or not pixels adjacent to a pixel which is indicated by the popped index and of which flag indicates unexamined constitute the connecting region, b) when the adjacent pixels constitute the connecting region, pushing the indexes of the adjacent pixels into the candidate stack and storing the indexes to the region memory, and c) setting the flags for the adjacent pixels.
2. The apparatus according to claim 1, wherein the labeling unit calculates a size of a region including pixels indicated by indexes stored in the memory when labeling pixels, and prohibits labeling the pixel when the calculated size is out of a predetermined range.
3. The apparatus according to claim 1, wherein the region memory comprises a stack memory.
4. An image processing method for labeling a connecting region of an image comprising:
scanning the image;
examining whether or not a pixel of the image constitutes the connecting region;
setting the flag for the examined pixel; and
when the pixel constituting the connecting region is first detected in the examination,
(1) pushing an index of the detected pixel into a stack, and storing the index to a memory,
(2) while there is at least one index stored in the stack, repeating a) popping an index from the stack to examine whether or not pixels adjacent to a pixel which is indicated by the popped index and of which flag indicates unexamined constitute the connecting region, b) when the adjacent pixels constitute the connecting region, pushing the indexes of the adjacent pixels into the stack and storing the indexes to the memory, and c) setting the flags for the adjacent pixels,
(3) reading all indexes out from the memory when there is no index stored in the stack to label pixels indicated by the read indexes with same number.
5. The method according to claim 4, wherein reading all indexes comprises calculating a size of a region including a pixel indicated by an index stored in the memory, and prohibiting labeling the pixel when the calculated size is out of a predetermined range.
6. The method according to claim 4, wherein the memory comprises a stack memory.
7. A computer readable recording medium which stores a program for controlling the computer to label a connecting region of an image, the program being capable of executing the following functions:
a function for scanning the image;
a function for examining whether or not a pixel of the image constitutes the connecting region;
a function for setting the flag for the examined pixel; and
when the pixel constituting the connecting region is first detected in the examination,
(1) a function for pushing an index of the detected pixel into a stack, and storing the index to a memory,
(2) while there is at least one index stored in the stack, repeating a) a function for popping an index from the stack to examine whether or not pixels adjacent to a pixel which is indicated by the popped index and of which flag indicates unexamined constitute the connecting region, b) when the adjacent pixels constitute the connecting region, a function for pushing the indexes of the adjacent pixels into the stack and storing the indexes to the memory, and c) a function for setting the flags for the adjacent pixels,
(3) a function for reading all indexes out from the memory when there is no index stored in the stack to label pixels indicated by the read indexes with same number.
8. The medium according to claim 7, wherein a function for reading all indexes calculates a size of a region including a pixel indicated by an index stored in the memory, and prohibits labeling the pixel when the calculated size is out of a predetermined range.
9. The medium according to claim 7, wherein the memory comprises a stack memory.
10. An image processing method for labeling a connecting region of an image comprising:
determining whether or not a pixel of the image is examined;
determining whether or not an adjacent pixel of the unexamined pixel is unexamined and constitutes the continuous connecting region;
pushing an index of the adjacent pixel into an examination candidate stack, and storing the index to a connecting region memory, when the adjacent pixel of the unexamined pixel is determined to be unexamined and constitute the continuous connecting region;
popping the index from the stack to repeat the above determining, pushing and storing to the pixel indicated by the popped index; and
reading all indexes out from the memory to label pixels indicated by the read indexes with the same number, when there is no index stored in the stack.
11. The method according to claim 10, further comprising while shifting the position of the pixel to be examined, detecting the connecting region for the unexamined pixel to label the detected connecting region.
US10/369,809 1999-03-31 2003-02-21 Image processing apparatus and method for recognizing specific pattern and recording medium having image processing program recorded thereon Expired - Fee Related US6731801B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/369,809 US6731801B2 (en) 1999-03-31 2003-02-21 Image processing apparatus and method for recognizing specific pattern and recording medium having image processing program recorded thereon

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP11092094A JP2000285240A (en) 1999-03-31 1999-03-31 Image processor, image processing method and recording medium in which image processing program is recorded
JP11-092094 1999-03-31
JP11-092050 1999-03-31
JP11092050A JP2000285237A (en) 1999-03-31 1999-03-31 Method and device for picture processing and recording medium with picture processing program recorded
US09/538,735 US6643400B1 (en) 1999-03-31 2000-03-30 Image processing apparatus and method for recognizing specific pattern and recording medium having image processing program recorded thereon
US10/369,809 US6731801B2 (en) 1999-03-31 2003-02-21 Image processing apparatus and method for recognizing specific pattern and recording medium having image processing program recorded thereon

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/538,735 Division US6643400B1 (en) 1999-03-31 2000-03-30 Image processing apparatus and method for recognizing specific pattern and recording medium having image processing program recorded thereon

Publications (2)

Publication Number Publication Date
US20030156757A1 US20030156757A1 (en) 2003-08-21
US6731801B2 true US6731801B2 (en) 2004-05-04

Family

ID=29272122

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/538,735 Expired - Fee Related US6643400B1 (en) 1999-03-31 2000-03-30 Image processing apparatus and method for recognizing specific pattern and recording medium having image processing program recorded thereon
US10/369,809 Expired - Fee Related US6731801B2 (en) 1999-03-31 2003-02-21 Image processing apparatus and method for recognizing specific pattern and recording medium having image processing program recorded thereon

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/538,735 Expired - Fee Related US6643400B1 (en) 1999-03-31 2000-03-30 Image processing apparatus and method for recognizing specific pattern and recording medium having image processing program recorded thereon

Country Status (1)

Country Link
US (2) US6643400B1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050152582A1 (en) * 2003-11-28 2005-07-14 Samsung Electronics Co., Ltd. Multiple person detection apparatus and method
US20070053610A1 (en) * 2005-09-06 2007-03-08 Canon Kabushiki Kaisha Image processing apparatus and control method therefor
US20100021061A1 (en) * 2008-07-28 2010-01-28 Richard John Campbell Methods and Systems for Connected-Component Labeling
US20110142303A1 (en) * 2009-12-10 2011-06-16 Canon Kabushiki Kaisha Image labeling using parallel processing
US20110164818A1 (en) * 2004-08-20 2011-07-07 Fuji Xerox Co., Ltd. Method of generating a labeled image and image processing system with pixel blocks
US20130028524A1 (en) * 2011-07-29 2013-01-31 Brother Kogyo Kabushiki Kaisha Image processing device identifying region in image as one of uniform region and nonuniform region
US20130148847A1 (en) * 2011-12-13 2013-06-13 Xerox Corporation Post-processing a multi-spectral image for enhanced object identification
KR101394493B1 (en) 2013-02-28 2014-05-14 한국항공대학교산학협력단 Single-pass labeler without label merging period
US8792719B2 (en) 2011-07-29 2014-07-29 Brother Kogyo Kabushiki Kaisha Image processing device determining attributes of regions
US8830529B2 (en) 2011-07-29 2014-09-09 Brother Kogyo Kabushiki Kaisha Image processing device for accurately identifying region in image without increase in memory requirement
US8837836B2 (en) 2011-07-29 2014-09-16 Brother Kogyo Kabushiki Kaisha Image processing device identifying attribute of region included in image

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7016538B2 (en) * 2000-04-27 2006-03-21 Minolta Co., Ltd. Image processor for detecting specified pattern
US20030229446A1 (en) * 2002-06-06 2003-12-11 Boscamp Robert L. Mobile education and entertainment system, method and device
GB0321169D0 (en) * 2003-09-10 2003-10-08 Hewlett Packard Development Co Methods and apparatus for generating images
US7594177B2 (en) * 2004-12-08 2009-09-22 Microsoft Corporation System and method for video browsing using a cluster index
JP4859390B2 (en) * 2005-05-10 2012-01-25 株式会社リコー Image processing method and image processing apparatus
JP4640155B2 (en) * 2005-12-15 2011-03-02 ソニー株式会社 Image processing apparatus and method, and program
JP4921202B2 (en) * 2006-03-15 2012-04-25 キヤノン株式会社 Job history management system, control method therefor, program, and storage medium
US8213734B2 (en) * 2006-07-07 2012-07-03 Sony Ericsson Mobile Communications Ab Active autofocus window
JP2009093638A (en) * 2007-09-20 2009-04-30 Canon Inc Image detection device and image detection method
JP4900175B2 (en) * 2007-10-04 2012-03-21 セイコーエプソン株式会社 Image processing apparatus and method, and program
JP5574816B2 (en) 2010-05-14 2014-08-20 キヤノン株式会社 Data processing apparatus and data processing method
JP5618670B2 (en) * 2010-07-21 2014-11-05 キヤノン株式会社 Data processing apparatus and control method thereof
US9792363B2 (en) * 2011-02-01 2017-10-17 Vdopia, INC. Video display method
US9117262B2 (en) 2012-11-30 2015-08-25 Adobe Systems Incorporated Learned piece-wise patch regression for image enhancement
US9122960B2 (en) * 2012-11-30 2015-09-01 Adobe Systems Incorporated Patch size adaptation for image enhancement
AU2013248213A1 (en) * 2013-10-24 2015-05-14 Canon Kabushiki Kaisha Method, apparatus and system for generating an attribute map for processing an image
GB2521157B (en) * 2013-12-10 2020-12-16 Network Mapping Uk Ltd Infrastructure positioning
JP6352133B2 (en) * 2014-09-26 2018-07-04 株式会社Screenホールディングス Position detection apparatus, substrate processing apparatus, position detection method, and substrate processing method
JP6529315B2 (en) * 2015-04-09 2019-06-12 キヤノン株式会社 Main subject detection method, main subject detection device and program
WO2016181550A1 (en) * 2015-05-14 2016-11-17 オリンパス株式会社 Image-processing device
JP6506153B2 (en) * 2015-10-27 2019-04-24 株式会社Screenホールディングス Displacement detection device, displacement detection method, and substrate processing apparatus

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4334241A (en) * 1979-04-16 1982-06-08 Hitachi, Ltd. Pattern position detecting system
US4783838A (en) 1984-12-26 1988-11-08 Konishiroku Photo Industry Co., Ltd. Image processing method and apparatus therefor
US5199083A (en) * 1990-07-30 1993-03-30 Hitachi, Ltd. Image data processing method and system for giving identification labels to areas of connected black picture elements
US5325445A (en) * 1992-05-29 1994-06-28 Eastman Kodak Company Feature classification using supervised statistical pattern recognition
US5572602A (en) 1993-02-25 1996-11-05 Fujitsu Limited Image extraction system for extracting patterns such as characters, graphics and symbols from image having frame formed by straight line portions
US5602940A (en) 1993-04-09 1997-02-11 Mitsubishi Denki Kabushiki Kaisha Method of labeling image data and device for the same using temporary labels having a concentration relationship
US5625717A (en) 1992-06-24 1997-04-29 Mitsubishi Denki Kabushiki Kaisha Image processing device for processing grey level images
US5696838A (en) * 1993-04-27 1997-12-09 Sony Corporation Pattern searching method using neural networks and correlation
US5841903A (en) * 1992-01-17 1998-11-24 Yamaha Corporation Method and device for extracting a connected component of image data
US5867593A (en) * 1993-10-20 1999-02-02 Olympus Optical Co., Ltd. Image region dividing apparatus
US6104832A (en) 1992-09-11 2000-08-15 Canon Kabushiki Kaisha Image processing method and apparatus
US6302329B1 (en) * 1994-12-27 2001-10-16 Sharp Kabushiki Kaisha Method and apparatus for recording digital information in two-dimensional pattern, carrier of digital information recorded thereby, and a method and apparatus for decoding such digital information
US6307963B1 (en) * 1997-06-02 2001-10-23 Matsushita Electric Industrial Co., Ltd. Image detection method, image detection apparatus, image processing method, image processing apparatus, and medium
US6320977B1 (en) * 1990-04-04 2001-11-20 Matsushita Electric Industrial Co., Ltd Method and apparatus for positional detection using pattern matching process
US6453069B1 (en) * 1996-11-20 2002-09-17 Canon Kabushiki Kaisha Method of extracting image from input image using reference image


Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050152582A1 (en) * 2003-11-28 2005-07-14 Samsung Electronics Co., Ltd. Multiple person detection apparatus and method
US20110164818A1 (en) * 2004-08-20 2011-07-07 Fuji Xerox Co., Ltd. Method of generating a labeled image and image processing system with pixel blocks
US8208728B2 (en) * 2004-08-20 2012-06-26 Fuji Xerox Co., Ltd. Method of generating a labeled image and image processing system with pixel blocks
US20070053610A1 (en) * 2005-09-06 2007-03-08 Canon Kabushiki Kaisha Image processing apparatus and control method therefor
US7738696B2 (en) * 2005-09-06 2010-06-15 Canon Kabushiki Kaisha Image processing apparatus and method for extracting a document area
US20100021061A1 (en) * 2008-07-28 2010-01-28 Richard John Campbell Methods and Systems for Connected-Component Labeling
US8300945B2 (en) 2008-07-28 2012-10-30 Sharp Laboratories Of America, Inc. Methods and systems for connected-component labeling
US20110142303A1 (en) * 2009-12-10 2011-06-16 Canon Kabushiki Kaisha Image labeling using parallel processing
US8600171B2 (en) * 2009-12-10 2013-12-03 Canon Kabushiki Kaisha Image labeling using parallel processing
US20130028524A1 (en) * 2011-07-29 2013-01-31 Brother Kogyo Kabushiki Kaisha Image processing device identifying region in image as one of uniform region and nonuniform region
US8792719B2 (en) 2011-07-29 2014-07-29 Brother Kogyo Kabushiki Kaisha Image processing device determining attributes of regions
US8830529B2 (en) 2011-07-29 2014-09-09 Brother Kogyo Kabushiki Kaisha Image processing device for accurately identifying region in image without increase in memory requirement
US8837836B2 (en) 2011-07-29 2014-09-16 Brother Kogyo Kabushiki Kaisha Image processing device identifying attribute of region included in image
US8929663B2 (en) * 2011-07-29 2015-01-06 Brother Kogyo Kabushiki Kaisha Image processing device identifying region in image as one of uniform region and nonuniform region
US20130148847A1 (en) * 2011-12-13 2013-06-13 Xerox Corporation Post-processing a multi-spectral image for enhanced object identification
US8818030B2 (en) * 2011-12-13 2014-08-26 Xerox Corporation Post-processing a multi-spectral image for enhanced object identification
KR101394493B1 (en) 2013-02-28 2014-05-14 한국항공대학교산학협력단 Single-pass labeler without label merging period

Also Published As

Publication number Publication date
US20030156757A1 (en) 2003-08-21
US6643400B1 (en) 2003-11-04

Similar Documents

Publication Title
US6731801B2 (en) Image processing apparatus and method for recognizing specific pattern and recording medium having image processing program recorded thereon
US6466694B2 (en) Document image processing device and method thereof
US9239946B2 (en) Method and apparatus for detecting and processing specific pattern from image
EP0738987B1 (en) Processing machine readable forms
EP0657831A1 (en) Image retrieving method and apparatus
US7102649B2 (en) Image filling method, apparatus and computer readable medium for reducing filling process in processing animation
JP2001109895A (en) Processing method for digital images
EP0831421B1 (en) Method and apparatus for retouching a digital color image
JPH06195513A (en) Method for extraction of bend point of optical character recognition system
JP2001101426A (en) Digital image processing method
EP1202213A2 (en) Document format identification apparatus and method
JP3718967B2 (en) Image feature amount extraction apparatus and method, and recording medium on which image feature amount extraction program is recorded
US6788829B1 (en) Image processing apparatus and method for recognizing specific pattern and recording medium having image processing program recorded thereon
US6356657B1 (en) Image processing method and apparatus for discriminating an input image area or an attribute of image information in the discriminated area
JP2000285237A (en) Method and device for picture processing and recording medium with picture processing program recorded
JP2004038530A (en) Image processing method, program used for executing the method and image processor
JP2004080341A (en) Image processor, image processing method, program, and recording medium
US6212294B1 (en) Image processing method and apparatus therefor
US5894525A (en) Method and system for simultaneously recognizing contextually related input fields for a mutually consistent interpretation
JP2000285240A (en) Image processor, image processing method and recording medium in which image processing program is recorded
JPH10336428A (en) Image processing unit
JPH11316797A (en) Method and device for discriminating area of document image
JP3186712B2 (en) Document reading device
JP2838556B2 (en) Image processing device
JP2002157550A (en) Device and method for recognizing character and recording medium

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20160504