GB2519130A - A method and apparatus for image segmentation - Google Patents

A method and apparatus for image segmentation

Info

Publication number
GB2519130A
GB2519130A GB1317979.1A GB201317979A
Authority
GB
United Kingdom
Prior art keywords
superpixels
distance
seeds
superpixel
digital image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1317979.1A
Other versions
GB201317979D0 (en)
Inventor
Tinghuai Wang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Oyj
Original Assignee
Nokia Oyj
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Oyj filed Critical Nokia Oyj
Priority to GB1317979.1A
Publication of GB201317979D0
Publication of GB2519130A
Withdrawn legal status (current)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/187Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/194Segmentation; Edge detection involving foreground-background segmentation

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

A set of seeds is associated with an object in a digital image 302, where the set of seeds comprises a plurality of superpixels classified as being associated with the object. A seed expansion processor 30 expands the set of seeds by determining at least two distance measures, where each of the at least two distance measures is a distance between each of at least two superpixels of a plurality of unclassified superpixels and the set of seeds. The distance measures are ranked by a ranking processor 305 in order of magnitude, and at least one of the two superpixels is classified as a superpixel of the set of seeds according to the rank order. The distance measures may be a combination of a Euclidean or spatial distance between centroids of the superpixels and an appearance distance between histograms of a colour channel. User interaction 304 may be used to classify an initial set of seeds as associated with a foreground object.

Description

Intellectual Property Office Application No. GB1317979.1 RTM Date: 9 April 2014

The following terms are registered trade marks and should be read as such wherever they occur in this document: Synopsys, Cadence. Intellectual Property Office is an operating name of the Patent Office. www.ipo.gov.uk

A Method and Apparatus for Image Segmentation
Field of the Application
The present application relates to video and images, and in particular, but not exclusively, to the segmentation of video and images.
Background
Segmentation of coloured textured images may be used in many image analysis and multimedia applications such as content based image retrieval, object recognition and 3D modelling.
One application of image segmentation is to extract a foreground object (or object of interest) out of an image which may have a cluttered background, thereby enabling the extracted foreground object to be composed onto a further image without being encumbered with the visual artefacts of the original background.
For complex images there can be more than one interpretation of the foreground object of interest, thus making the task of extracting it ill-posed and ambiguous. In order to overcome any potential ambiguity in identifying and extracting the foreground object, image segmentation systems can incorporate a level of user interaction into the process. For instance, interactive image segmentation requires a user to provide markers or scribbles in order to label regions of interest, which are then automatically segmented.
However, such interactive image segmentation schemes often require a user to iteratively add more scribbles or markers to achieve the desired result, resulting in a process which is often tedious and cumbersome especially if the user is making the adjustments on a compact touchscreen device.
Summary of various examples
This application therefore proceeds from the consideration that whilst a user can iteratively add more scribbles to an image in order to achieve the desired result of locating the various regions of interest in an image, it is preferable to minimize as much as possible the number of iterations especially when using a compact touchscreen device.
There is provided according to a first aspect a method comprising: expanding a set of seeds associated with an object in a digital image, the set of seeds comprise a plurality of superpixels of the digital image classified as being associated with the object, the set of seeds is expanded by: determining at least two distance measures, each of the at least two distance measures is a distance between each of at least two superpixels of a plurality of unclassified superpixels of the digital image and the set of seeds; ranking the at least two distance measures in order of the magnitude of their values; and classifying at least one of the at least two superpixels as a superpixel of the set of seeds according to the rank order of the at least two distance measures.
Ranking the at least two distance measures in order of the magnitude of their values may comprise arranging the at least two distance measures in ascending order of magnitude.
Classifying the at least one of the at least two superpixels as a superpixel of the set of seeds according to the rank order of the at least two distance measures may comprise: determining that at least one of the at least two distance measures are one of a specified number of highest rank order distance measures; and selecting the at least one of the at least two superpixels associated with the determined at least one of the at least two distance measures as a superpixel of the set of seeds.
Each of the at least two distance measures may comprise a combination of a first distance metric and a second distance metric.
The first distance metric may represent the spatial distance between a centroid of a superpixel of one of the at least two superpixels of the plurality of unclassified superpixels and the centroid of one of the plurality of superpixels of the set of seeds.
The spatial distance may be a Euclidean distance.
The second distance metric may represent an appearance distance between a feature descriptor of the superpixel of one of the at least two superpixels of the plurality of unclassified superpixels and a feature descriptor of one of the plurality of superpixels of the set of seeds.
The superpixel may comprise a plurality of pixels of the digital image, and the feature descriptor of a superpixel may comprise a populated histogram of a colour channel value for the plurality of pixels of the superpixel.
The colour channel value may be a pixel colour channel value given by a colour channel from the Commission internationale de l'éclairage L* a* b* colour space.
The object in the digital image may be a foreground object, and the set of seeds may comprise a plurality of superpixels of the digital image classified as being associated with the foreground object.
The method may further comprise repeating the method described herein, the object in the digital image may be a background object, the set of seeds may comprise a plurality of superpixels of the digital image classified as being associated with the background object, and each of the at least two distance measures may be a combination of a distance measure between each of at least two superpixels of a further plurality of unclassified superpixels and the set of background seeds and a distance measure between each of the at least two superpixels of the further plurality of unclassified superpixels and the set of foreground seeds.
According to a second aspect of there is provided an apparatus comprising at least one processor and at least one memory including computer program code for one or more programs, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to expand a set of seeds associated with an object in a digital image, the set of seeds comprise a plurality of superpixels of the digital image classified as being associated with the object, the set of seeds is expanded by that apparatus being further caused to: determine at least two distance measures, each of the at least two distance measures is a distance between each of at least two superpixels of a plurality of unclassified superpixels of the digital image and the set of seeds; rank the at least two distance measures in order of the magnitude of their values; and classify at least one of the at least two superpixels as a superpixel of the set of seeds according to the rank order of the at least two distance measures.
The apparatus caused to rank the at least two distance measures in order of the magnitude of their values may be further caused to arrange the at least two distance measures in ascending order of magnitude.
The apparatus caused to classify the at least one of the at least two superpixels as a superpixel of the set of seeds according to the rank order of the at least two distance measures may be further caused to: determine that at least one of the at least two distance measures are one of a specified number of highest rank order distance measures; and select the at least one of the at least two superpixels associated with the determined at least one of the at least two distance measures as a superpixel of the set of seeds.
Each of the at least two distance measures may comprise a combination of a first distance metric and a second distance metric.
The first distance metric may represent the spatial distance between a centroid of a superpixel of one of the at least two superpixels of the plurality of unclassified superpixels and the centroid of one of the plurality of superpixels of the set of seeds.
The spatial distance may be a Euclidean distance.
The second distance metric may represent an appearance distance between a feature descriptor of the superpixel of one of the at least two superpixels of the plurality of unclassified superpixels and a feature descriptor of one of the plurality of superpixels of the set of seeds.
The superpixel may comprise a plurality of pixels of the digital image, and the feature descriptor of a superpixel may comprise a populated histogram of a colour channel value for the plurality of pixels of the superpixel.
The colour channel value may be a pixel colour channel value given by a colour channel from the Commission internationale de l'éclairage L* a* b* colour space.
The object in the digital image may be a foreground object, and the set of seeds may comprise a plurality of superpixels of the digital image classified as being associated with the foreground object.
There is according to a third aspect an apparatus configured to: expand a set of seeds associated with an object in a digital image, the set of seeds comprise a plurality of superpixels of the digital image classified as being associated with the object, the set of seeds is expanded by the apparatus being configured to: determine at least two distance measures, each of the at least two distance measures is a distance between each of at least two superpixels of a plurality of unclassified superpixels of the digital image and the set of seeds; rank the at least two distance measures in order of the magnitude of their values; and classify at least one of the at least two superpixels as a superpixel of the set of seeds according to the rank order of the at least two distance measures.
The apparatus configured to rank the at least two distance measures in order of the magnitude of their values may be further configured to arrange the at least two distance measures in ascending order of magnitude.
The apparatus configured to classify the at least one of the at least two superpixels as a superpixel of the set of seeds according to the rank order of the at least two distance measures may be further configured to: determine that at least one of the at least two distance measures are one of a specified number of highest rank order distance measures; and select the at least one of the at least two superpixels associated with the determined at least one of the at least two distance measures as a superpixel of the set of seeds.
Each of the at least two distance measures may comprise a combination of a first distance metric and a second distance metric.
The first distance metric may represent the spatial distance between a centroid of a superpixel of one of the at least two superpixels of the plurality of unclassified superpixels and the centroid of one of the plurality of superpixels of the set of seeds.
The spatial distance may be a Euclidean distance.
The second distance metric may represent an appearance distance between a feature descriptor of the superpixel of one of the at least two superpixels of the plurality of unclassified superpixels and a feature descriptor of one of the plurality of superpixels of the set of seeds.
The superpixel may comprise a plurality of pixels of the digital image, and the feature descriptor of a superpixel may comprise a populated histogram of a colour channel value for the plurality of pixels of the superpixel.
The colour channel value may be a pixel colour channel value given by a colour channel from the Commission internationale de l'éclairage L* a* b* colour space.
The object in the digital image may be a foreground object, and the set of seeds may comprise a plurality of superpixels of the digital image classified as being associated with the foreground object.
A computer program product may comprise at least one computer readable non-transitory medium having program code stored thereon, the program code, when executed by an apparatus, causing the apparatus at least to perform the method as described herein.
An electronic device may comprise apparatus as described herein.
A chipset may comprise apparatus as described herein.
For a better understanding of the present application and as to how the same may be carried into effect, reference will now be made by way of example to the accompanying drawings in which:
Summary of Figures
Figure 1 shows a schematic representation of an apparatus suitable for implementing some example embodiments;
Figure 2 shows a schematic representation of an image segmentation system suitable for employing some embodiments;
Figure 3 shows schematically a seed expansion engine according to some embodiments;
Figure 4 shows a flow diagram of the operation of the seed expansion processor; and
Figure 5 shows a flow diagram further detailing some processes carried out by the seed expansion processor.
Embodiments of the Application

The application describes apparatus and methods for segmenting an image into various regions. The embodiments described hereafter may be utilised in various applications and situations where images or video are segmented into various regions. For example, such applications and situations may include segmenting an image into foreground and background objects of interest and using image or video matting techniques to form a further image or video comprising foreground/background components.
The following describes apparatus and methods for segmenting an image or video into foreground and background objects. In this regard reference is first made to Figure 1 which discloses a schematic block diagram of an exemplary electronic device 10 or apparatus. The electronic device is configured to perform interactive image segmentation according to some embodiments of the application.
The electronic device 10 is in some embodiments a mobile terminal, mobile phone or user equipment for operation in a wireless communication system. In other embodiments, the electronic device 10 may be an audio video device such as a video camera, media recorder, media player or a digital camera, or any other personal electronic device suitable for processing image based signals. In other embodiments the apparatus and methods for performing interactive image segmentation may be performed on a computer system or a display device such as a television or monitor based system comprising one or more processors implemented using any desired architecture or chip set.
The electronic device 10 comprises a processor 15 which is coupled to a display 12. The processor 15 is further coupled to a transceiver (TX/RX) 13, to a user interface (UI) 14 and to a memory 16. The processor 15 may also be coupled to a graphics component/module 20 which may include a graphics processing unit (GPU) 21.
The processor 15 may be configured to execute various program codes 17. The program codes 17, in some embodiments, comprise image or video segmentation code. The implemented program codes 17 in some embodiments further comprise additional code for further processing of images. The implemented program codes 17 may in some embodiments be stored for example in the memory 16 for retrieval by the processor 15 whenever needed. The memory 16 in some embodiments may further provide a section 18 for storing data, for example data that has been processed in accordance with the application. The program codes 17 may also comprise operating system code which may at least enable operation of software in the electronic device 10.
The user interface 14 in some embodiments enables a user to input commands to the electronic device 10, for example via a keypad, user operated buttons or switches or by a touch interface on the display 12. Furthermore the user may in some embodiments obtain information from the electronic device 10, for example via the display 12 of the operation of the apparatus 10. In some other embodiments the user may be informed of operations by a sound or audio sample via a speaker (not shown).
The user interface 14 in some embodiments may initiate, interact with, direct, and/or control a video image segmentation method performed by the graphics module 20. The graphics module 20 may obtain a set of digital video images and, optionally with user input via the user interface 14, analyse and perform image segmentation on the image or video frame.
The transceiver 13 enables communication with other electronic devices, for example in some embodiments via a wireless communication network.
It is to be understood again that the structure of the electronic device 10 could be supplemented and varied in many ways.
A user of the electronic device 10 may use a camera and video module (not shown) for capturing images or video frames which may be stored in the data section 18 of the memory 16. An application in some embodiments may be activated by the user via the user interface 14. This application, which may in some embodiments be run by the processor 15, causes the processor 15 to execute the code stored in the memory 16.
It would be appreciated that the schematic structures described in Figures 2 and 3, and the method steps shown in Figures 4 and 5, represent only a part of the operation of video image segmentation and specifically part of an apparatus or method for interactive image segmentation as exemplarily implemented in the apparatus shown in Figure 1.
The general operation of video image segmentation as employed by embodiments is shown in Figure 2. General interactive image segmentation systems comprise an image segmentation engine (or image segmentation processor) 102, as illustrated schematically in Figure 2.
The image segmentation processor 102 can segment an input image 110 into a segmented image of foreground and background objects 114. The segmentation process can be performed with the intervention of a user 112 for providing prior indications of foreground and background regions or objects. The user intervention 112 may take the form of drawing rough scribbles on a touchscreen display in order to indicate the various regions within the image. In the exemplary electronic device depicted in Figure 1, the rough scribbles may be hand drawn by the user via a touchscreen UI 14.
The concept for the embodiments as described herein is to firstly convert the rigid grid like pixel structure of the image into perceptual and meaningful atomic entities known as superpixels. The various superpixels in the image may then be initially classified as either foreground or background superpixels in accordance with the scribbles as entered by the user. The initially classified groups of foreground and background superpixels may then be used as seeds for automated further classification of the foreground and background superpixels. The new foreground and background seeds may then be used in further stages of the image segmentation process. In that respect reference is made to Figure 3, which depicts a seed expansion processor 30 that may form part of the image segmentation processor 102 according to some embodiments.
In some embodiments the seed expansion processor 30 may comprise a superpixel extractor 301, a superpixel selector 303 and a superpixel ranking processor 305.
The operation of these components is described in more detail with reference to the flow chart in Figure 4 showing the operation of the image segmentation processor 102.
The input to the image segmentation processor 102 may be connected to the input of the seed expansion processor 30 along which a digital image 302 may be received by a superpixel extractor 301.
In embodiments the superpixel extractor 301 may be arranged to segment the digital image 302 into small contiguous and perceptually similar regions of pixels, otherwise known as superpixels. In other words the superpixel extractor 301 may be configured to implement a superpixel algorithm which can group pixels of a digital image into perceptually meaningful atomic regions, thereby replacing the rigid grid like structure of the pixel representation of the digital image 302.
Converting the rigid pixel structure of the digital image 302 into regions of perceptually similar pixels can have the advantage of capturing any redundancy in the image as well as providing a convenient primitive from which to compute image features. Furthermore, since the structure of a superpixel mapped image is less rigid than the pixel grid form of the digital image from which it is derived, the superpixel based image can have the further advantage of reducing the complexity of subsequent image processing steps.
In a first group of embodiments the digital image 302 may be mapped by the superpixel extractor 301 into superpixels by using the method of Simple Linear Iterative Clustering (SLIC). The method may comprise initially converting the digital image 302 into the Commission internationale de l'éclairage L* a* b* colour space, followed by a clustering procedure in which the converted digital image 302 is sampled on a regular grid which is spaced at a number of pixels apart. This initial step may form a number of initial regular clusters whose centres lie on the newly formed grid. The centres may then be moved to seed locations corresponding to the lowest gradient position in a 3x3 neighbourhood. Each pixel may then be associated with the nearest cluster centre whose search region overlaps the pixel's location. This may be performed by using a distance measure which may be used to determine the nearest cluster centre for each pixel. Once each pixel has been assigned to its respective nearest cluster centre, an update step may be performed in order to adjust each cluster centre to the mean vector of all pixels belonging to the cluster. In some embodiments the above assignment step may be repeated iteratively until an error measure converges.
Further details of the above superpixel extraction method may be found in the IEEE publication Transactions on Pattern Analysis and Machine Intelligence, Vol. 34, No. 11, entitled "SLIC Superpixels Compared to State-of-the-Art Superpixel Methods," by Radhakrishna Achanta, Appu Shaji, Kevin Smith, Aurelien Lucchi, Pascal Fua and Sabine Süsstrunk.
It is to be understood that other embodiments may use different methods of extracting superpixels. For example, some embodiments may deploy a gradient ascent based algorithmic approach in forming the superpixels. In this approach the algorithm may initially start from a rough initial clustering of pixels and then iteratively refine the clusters by using a gradient ascent method until a convergence criterion is reached. In another example, some embodiments may adopt a graph based approach in which each pixel is treated as a node on a graph.
It is to be further appreciated that other embodiments may convert pixels of the digital image 302 into alternative colour spaces which also approximate human vision, such as CIE L*a*b* or CIE L*u*v*.
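By way of a non-limiting illustration only, the superpixel extraction step described above may be realised with an off-the-shelf SLIC implementation such as the one in scikit-image; the sketch below is an assumption, not the claimed method, and the segment count and compactness values are purely illustrative.

from skimage import img_as_float
from skimage.segmentation import slic

def extract_superpixels(image_rgb, n_segments=400, compactness=10.0):
    # slic() converts RGB input to the CIE L*a*b* colour space internally
    # and iterates the assignment/update steps described above, returning
    # an integer label map that assigns every pixel to a superpixel.
    return slic(img_as_float(image_rgb), n_segments=n_segments,
                compactness=compactness, start_label=0)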
The step of extracting and partitioning the digital image 302 into superpixels is shown as processing step 401 in Figure 4.

After the digital image 302 has been partitioned into superpixels, the superpixel extractor 301 may then derive a feature descriptor for each extracted superpixel of the digital image 302.
The feature descriptor for each superpixel may be derived by determining its Commission internationale de l'éclairage (CIE) L* a* b* colour histogram.
In embodiments, the CIE Lab colour histogram may be determined by dividing each of the L*, a* and b* colour channels of each pixel of the digital image 302 into a number of bins, where each bin is assigned a range of non-overlapping colour channel intensity values. The histogram is then formed by populating each channel bin with a pixel whose channel value lies within the range of values for the particular bin.
In other words, for all the pixels assigned to a particular superpixel, the L*, a* and b* colour channels of the pixel's CIE L*a*b* colour space may be divided into a number of bins, and each bin is then assigned a range of colour channel intensity values.
The histogram is then formed by populating each channel bin with those pixels assigned to the superpixel whose channel value lies within the range of values for the particular bin.
It is to be understood that each superpixel feature descriptor is derived from the CIE L*a*b* transformed pixels collectively assigned to the particular superpixel in question.
In a first group of embodiments each colour channel may be divided into twenty bins.
However, it is to be appreciated that other embodiments may adopt a different number of bins per colour channel.
The step of determining a feature descriptor for each superpixel by determining the CIE L*a*b* histogram of the pixels assigned to each superpixel is shown as processing step 403 in Figure 4.
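A minimal sketch of this descriptor computation, assuming twenty bins per channel and the channel ranges produced by scikit-image's rgb2lab; the helper name, the bin ranges and the unit-sum normalisation are illustrative assumptions rather than requirements of the application.

import numpy as np
from skimage.color import rgb2lab

def superpixel_descriptors(image_rgb, labels, bins=20):
    # rgb2lab returns L* in [0, 100] and a*/b* roughly in [-128, 127].
    lab = rgb2lab(image_rgb)
    ranges = [(0.0, 100.0), (-128.0, 127.0), (-128.0, 127.0)]
    descriptors = {}
    for sp in np.unique(labels):
        mask = labels == sp
        hists = []
        for c, (lo, hi) in enumerate(ranges):
            # Populate each channel's bins from the pixels of this superpixel.
            h, _ = np.histogram(lab[..., c][mask], bins=bins, range=(lo, hi))
            hists.append(h)
        h = np.concatenate(hists).astype(float)
        descriptors[sp] = h / max(h.sum(), 1e-12)
    return descriptors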
The output from the superpixel extractor 301 may be arranged to be connected to the input of the superpixel selector 303. In other words, the superpixel selector 303 may be configured to receive as input the superpixels for the digital image 302 as determined by the superpixel extractor 301. The superpixel selector 303 may also be arranged to receive a further input 304 along which information relating to the interactions of a user may be passed.
In embodiments the superpixel selector may be configured to provide an initial selection of superpixels associated with a foreground object and an initial selection of superpixels associated with the background of the digital image 302.
In embodiments this initial selection may be performed with the interaction of a user, whereby the user roughly indicates a foreground object of interest. In some embodiments the selected foreground object may be indicated by the user physically drawing a scribble over the area of the digital image 302 having the foreground object of interest. For example, in the exemplary apparatus 10 depicted in Figure 1, the user may select the object of interest by indicating the appropriate area of the digital image on the touch sensitive screen 12.
It is to be appreciated that the information associated with the selected foreground object may be passed to the superpixel selector 303 via the input 304.
The superpixel selector 303 may, upon receiving the user's initial indication of the foreground object of interest, classify the superpixels in the region of the indication on the digital image 302 as the initial set of superpixels associated with the foreground object.
Additionally, in embodiments the superpixel selector 303 may initially classify the superpixels positioned around the border of the digital image 302 as the initial set of background object superpixels.
The initial set of superpixels associated with the foreground object may form the initial set of foreground object seeds, and the initial set of superpixels associated with the background of the digital image 302 may form the initial set of background seeds.
The step of determining the initial set of foreground object seeds and the initial set of background seeds through user interaction is shown as processing step 405 in Figure 4.
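A hedged sketch of processing step 405, assuming the user's scribble is available as a boolean pixel mask aligned with the label map; the helper name and the treatment of superpixels touched by both criteria are illustrative assumptions.

import numpy as np

def initial_seeds(labels, scribble_mask):
    # Superpixels touched by the user's scribble form the foreground seeds.
    foreground = set(np.unique(labels[scribble_mask]))
    # Superpixels touching the image border form the background seeds.
    border = np.concatenate([labels[0, :], labels[-1, :],
                             labels[:, 0], labels[:, -1]])
    background = set(np.unique(border)) - foreground
    unclassified = set(np.unique(labels)) - foreground - background
    return foreground, background, unclassified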
The initial set of foreground object seeds and the initial set of background seeds may be passed to the superpixel ranking processor 305. Furthermore the superpixel selector 303 may also be arranged to pass to the superpixel ranking processor 305 the superpixels which have not been assigned to either the initial set of foreground object seeds or the initial set of background seeds, in other words the unlabelled superpixels.
Additionally in embodiments the superpixel ranking processor 305 may be further arranged to receive the superpixel feature descriptors associated with the superpixels of the digital image 302. This may be depicted in Figure 3 as the connection 306 from the superpixel extractor 301 to the superpixel ranking processor 305.
The superpixel ranking processor 305 may then expand the number of superpixels classified as foreground seeds by first classifying and then assigning a number of the unlabelled superpixels to the current set of superpixels which have been previously classified as foreground seeds.
Furthermore the superpixel ranking processor 305 may also expand the number of superpixels classified as background seeds. This may be achieved by first classifying and then assigning a number of the superpixels which have not been previously classified as foreground seeds to the set of background seeds.
The operation of the superpixel ranking processor 305 will hereafter be described in more detail by reference to the processing steps in Figure 5.
In embodiments the process of classifying and assigning superpixels to the current set of foreground seeds may be dependent upon a distance metric between the centroid of an unclassified superpixel and a centroid of a foreground seed.
However, it is to be understood that the set of foreground superpixels comprises a number of superpixels each having its own centroid. Accordingly, a distance metric may be calculated between the centroid of a particular unclassified superpixel and the centroid of each of the foreground superpixels. The overall distance metric between the particular unclassified superpixel and the set of foreground superpixels may then be determined to be the shortest distance metric. An overall distance metric may be determined for a number of unclassified superpixels.
The superpixel ranking processor 305 can then arrange the determined overall distance metrics in an ascending rank in order to identify the unclassified superpixels with the smallest overall distance metrics. The unclassified superpixels with the smallest overall distance metric may be then classified as foreground seeds, thereby expanding the set of foreground seeds.
In a first group of embodiments the superpixel ranking processor 305 may determine a foreground distance metric D_f(x, f) between the centroid of a particular unclassified superpixel and the centroid of each foreground seed of the current set of foreground seeds Ω_F in turn, where x is the unassigned superpixel and f is a foreground seed. In other words, for each superpixel x the foreground distance metric D_f(x, f) may be determined for each foreground seed within the current set of foreground seeds Ω_F.
The step of determining a foreground distance metric (or measure) between the centroid of a particular unclassified superpixel and the centroid of each foreground seed in turn is shown as processing step 501 in Figure 5.
The foreground distance metric which yields the smallest distance between the particular unclassified superpixel x and each foreground seed may then be selected to be the overall foreground distance metric D_F(x) between the superpixel x and the current set of foreground seeds Ω_F. In other words the overall foreground distance metric D_F(x) may be expressed as

D_F(x) = min_{f ∈ Ω_F} D_f(x, f)    (1)

The step of determining the foreground distance metric by selecting the smallest distance between the particular unclassified superpixel and each foreground seed is shown as processing step 503 in Figure 5.
In embodiments the unclassified superpixel may then be assigned to the set of foreground seeds Ω_F dependent on the value of the overall foreground distance metric D_F(x).
In the first group of embodiments the foreground distance metric may be a combination of a spatial distance component, given by the Euclidean distance d_s(x, f) between the centroid of the unclassified superpixel x and the centroid of a foreground seed f, and an appearance distance component, given by the normalised χ² distance between the feature descriptor h_x(k) for the unassigned superpixel x and the feature descriptor h_f(k) for the foreground seed f.
The appearance distance component between the unassigned superpixel x and a foreground seed f may be given in the first group of embodiments as

χ²(h_x, h_f) = (1/2) Σ_{k=1..K} [h_x(k) − h_f(k)]² / [h_x(k) + h_f(k)]    (2)

where K is the number of bins present in a feature descriptor, in other words the number of bins in the CIE L*a*b* histogram for each colour channel associated with a superpixel. For example, in the first group of embodiments twenty bins per colour channel are used, in accordance with the number of bins used in processing step 403.
In the first group of embodiments the foreground distance metric may be expressed as

D_f(x, f) = d_s(x, f) + χ²(h_x, h_f)    (3)

The above processing steps 501 and 503 may be repeated for all superpixels which have not been assigned to either the foreground or background seed sets. In Figure 5 this is depicted as the decision step 505, where it is checked whether the overall foreground distance metric D_F(x) has been determined for each unclassified superpixel.
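Equations (1) to (3) may be sketched as follows. The helper names are assumptions; centroids is taken to map each superpixel label to its (row, col) centroid, descriptors to the histograms computed earlier, and no scaling between the spatial and appearance terms is shown because the application does not specify one.

import numpy as np

def chi2_distance(hx, hf):
    # Equation (2): chi-squared distance between two normalised histograms.
    denom = hx + hf
    valid = denom > 0
    return 0.5 * np.sum((hx[valid] - hf[valid]) ** 2 / denom[valid])

def overall_distance(x, seeds, centroids, descriptors):
    # Equations (1) and (3): the minimum over all seeds of the Euclidean
    # centroid distance plus the chi-squared appearance distance.
    best = np.inf
    for s in seeds:
        d = (np.linalg.norm(centroids[x] - centroids[s])
             + chi2_distance(descriptors[x], descriptors[s]))
        best = min(best, d)
    return best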
It is to be understood in embodiments that as a result of processing step 503 there will be an overall foreground distance metric D_F(x) for each unclassified superpixel of the digital image 302.

The overall foreground distance metric D_F(x) for each unclassified superpixel may then form the distance measure between each unclassified superpixel and the set of foreground seeds Ω_F. The superpixel ranking processor 305 may then be configured to rank the overall foreground distance metric D_F(x) for all unassigned superpixels in ascending order of magnitude.
The step of arranging all determined overall foreground distance metrics in ascending rank order of magnitude is shown as processing step 507 in Figure 5.
The superpixel ranking processor 305 may then be arranged to classify a number of the unclassified superpixels as foreground seeds according to their assembled relative rank order.
In embodiments, this may take the form of selecting those unclassified superpixels which have the highest rank order of overall foreground distance metrics. In other words, the superpixels which have the smallest overall foreground distance metric may be classified as foreground seeds by the superpixel ranking processor 305.
It is to be appreciated that the step of classifying a number of the unassigned superpixels as foreground seeds may be seen as expanding the set of foreground seeds Ω_F.
For instance, in the first group of embodiments the superpixel ranking processor 305 may select the superpixels whose overall foreground distance metric D_F(x) lies within a specified percentile of the total number of ranked unclassified superpixels.
For example, experiments have indicated that for some embodiments a percentile of 5% was found to produce an advantageous result. In other words, in this particular example, the top 5% of superpixels in ascending rank order of the overall foreground distance metric may be classified as foreground seeds and therefore assigned to the current set of foreground seeds Ω_F.
The step of classifying remaining unclassified superpixels as foreground seeds dependent on the rank order of foreground measure values D_F(x) is shown as processing step 509 in Figure 5.
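The ranking and percentile selection of processing steps 507 and 509 might then be sketched as below, reusing the overall_distance helper above; the 5% default follows the example in the description and the function name is an assumption.

def expand_seeds(unclassified, seeds, centroids, descriptors, percentile=0.05):
    # Rank unclassified superpixels by distance to the seed set (ascending)
    # and promote the closest fraction into the seed set.
    ranked = sorted(unclassified,
                    key=lambda x: overall_distance(x, seeds, centroids,
                                                   descriptors))
    n_promote = max(1, int(percentile * len(ranked)))
    promoted = set(ranked[:n_promote])
    return seeds | promoted, unclassified - promoted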
It is to be appreciated that the above processing steps have been described in terms of unclassified superpixels which do not belong to the current set of background seeds Ω_B. However, it is to be appreciated that in other groups of embodiments the above processing steps for expanding the set of foreground seeds may be performed by additionally encompassing superpixels which have been initially classified as background seeds by the superpixel selector 303.
The overall processing step of expanding the set of foreground seeds by classifying unclassified superpixels as foreground seeds is depicted as 407 in Figure 4.
As mentioned above, the superpixel ranking processor 305 may also be arranged to additionally expand the set of background seeds Ω_B. The process of classifying and assigning superpixels to the current set of background seeds may be dependent upon the combination of a distance metric between the centroid of the unclassified superpixel and the centroid of a previously classified foreground superpixel, and a distance metric between the centroid of the unclassified superpixel and the centroid of a previously classified background superpixel. The superpixel ranking processor 305 may then be arranged to rank unassigned superpixels according to their value of the determined combined distance metric.
The superpixel ranking processor 305 may then be configured to classify a subset of the current unclassified superpixels as background seeds, thereby expanding the set of superpixels classified as background seeds. The classification of unclassified (or unassigned) superpixels as background seeds may also take a similar form to the method adopted for the classification of the foreground seeds. In other words, the classification may be based upon the relative rank order of distance metric values, whereby the superpixel ranking processor 305 selects a proportion of the highest ranked, or closest, unclassified superpixels to the set of background seeds as background seeds.
In a manner similar to the above, the superpixel ranking processor 305 may also determine an overall background distance metric for each of the above remaining unclassified superpixels. In embodiments the overall background distance metric D_Bb(x) may also be determined using processing techniques similar to the techniques used during the processing of steps 501 to 503 for the determination of the overall foreground distance measure D_F(x).
Accordingly, in a first group of embodiments the superpixel ranking processor 305 may first determine a background distance metric D_b(x, b) between a particular unclassified superpixel and each background seed, where x is the unassigned superpixel and b is a background seed. In other words, for each superpixel x the background distance metric D_b(x, b) may be determined for each background seed in the set of background seeds Ω_B. Thus, in a manner similar to that of equation 1, the overall background distance metric D_Bb(x) may be given as

D_Bb(x) = min_{b ∈ Ω_B} D_b(x, b)    (4)

where the overall background distance metric D_Bb(x) is determined to be the background distance metric with the smallest distance value.
As before, the distance metric D_b(x, b) may be given by the combination of a spatial distance component, given by the Euclidean distance d_s(x, b) between the centroid of the unclassified superpixel x and the centroid of a background seed b, and an appearance distance component, given by the normalised χ² distance between the feature descriptor h_x(k) for the unassigned superpixel x and the feature descriptor h_b(k) for the background seed b.
The appearance distance components between the unclassified superpixel x and a background seed b may be given in the first group of embodiments in a manner similar to that of equation 2.
Accordingly, in the first group of embodiments the background distance metric may also be expressed, in a manner similar to equation 3, as a combination of the normalised χ² distance between the feature descriptors h_x(k) and h_b(k) and the spatial distance component d_s(x, b). Thus D_b(x, b) may be expressed as

D_b(x, b) = d_s(x, b) + χ²(h_x, h_b)    (5)

The step of determining in turn the background distance metric D_b(x, b) between a remaining unclassified superpixel and each background seed is shown as processing step 511 in Figure 5.
The step of determining the overall background distance metric by selecting the smallest distance between the particular remaining unclassified superpixel and each background seed is shown as processing step 513 in Figure 5.
Additionally, the superpixel ranking processor 305 may also re-determine the overall foreground distance metric D_F(x) for each remaining unclassified superpixel in relation to the updated set of foreground superpixels. This may be performed for all superpixels which have not been previously classified as either foreground or background seeds, in other words the unclassified superpixels which remain unclassified after the processing of steps 501 to 509.
It is to be appreciated that the superpixel ranking processor 305 may determine the overall foreground distance metric D_F(x) for each of the remaining unclassified superpixels by using the same techniques as given by processing steps 501 and 503, and equations 1 to 3. For the sake of clarity, the overall foreground distance metric for an unclassified superpixel x relating to the process of being classified as a background seed may be denoted as D_Fb(x).
The step of re-determining the overall foreground distance metric D_Fb(x) for each remaining unclassified superpixel, taking into account the updated set of foreground seeds resulting from processing step 509, is shown as processing step 515 in Figure 5.
The distance measure D_B(x) between the unclassified superpixel x and the set of background seeds Ω_B may be expressed as a combination of the overall background distance metric D_Bb relating to the existing set of background seeds and the overall foreground distance metric D_Fb relating to the recently expanded (or updated) set of foreground seeds.
In the first group of embodiments the combination of the overall background distance metric D_Bb relating to the existing set of background seeds and the overall foreground distance metric D_Fb relating to the recently expanded (or updated) set of foreground seeds may take the form of

D_B(x) = (1 − D_Fb(x)) + D_Bb(x)    (6)

With reference to Figure 5, the step of determining the background distance measure D_B(x) between a remaining unclassified superpixel x and the set of background seeds Ω_B is shown as processing step 517.
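Equation (6) may be sketched as below, reusing overall_distance for both seed sets since equations (3) and (5) share the same form. The division by d_fb_max is an assumed normalisation, keeping the (1 − D_Fb) term in a sensible range; the application does not specify how the two terms are made commensurate.

def background_distance_measure(x, background, foreground, centroids,
                                descriptors, d_fb_max):
    # Equation (6): D_B(x) = (1 - D_Fb(x)) + D_Bb(x), so superpixels close
    # to the background seeds and far from the updated foreground seeds
    # receive the smallest background distance measure.
    d_bb = overall_distance(x, background, centroids, descriptors)
    d_fb = overall_distance(x, foreground, centroids, descriptors)
    return (1.0 - d_fb / d_fb_max) + d_bb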
The background distance measure D_B(x) may be determined for all remaining unclassified superpixels of the digital image 302.
With reference to Figure 5, the act of performing the above processing steps for all remaining unclassified superpixels is shown as the decision step 519, where it is checked whether the background distance measure D_B has been determined for all remaining unclassified superpixels. As depicted in Figure 5, if it is determined at processing step 519 that the background distance measure D_B has not been determined for all remaining superpixels then the return path 520 is taken.
Otherwise the process will proceed to the next processing step along the path 522.
The superpixel ranking processor 305 may then be arranged to rank the background distance measure D_B for all remaining unclassified superpixels in ascending order of magnitude.
The step of arranging for the background distance measures for all remaining superpixels to be assembled in ascending rank order of magnitude is shown as processing step 521 in Figure 5.
In embodiments the superpixel ranking processor 305 may then be configured to classify a number of the remaining unclassified superpixels as background seeds according to their assembled relative rank order.
The superpixel ranking processor 305 can then classify as background seeds those superpixels of the remaining unclassified superpixels which have the highest rank order. In other words, the superpixels which have the smallest background distance measure are classified by the superpixel ranking processor 305 as background seeds.
Again, it is to be appreciated that the step of classifying a number of remaining unclassified superpixels as background seeds may be seen as expanding the set of background seeds Ω_B.
The superpixel ranking processor 305 may classify as background seeds the remaining unclassified superpixels which have a background distance measure D_B(x) within a specified percentile of the total number of remaining unclassified superpixels.
For example, experiments have shown that for some embodiments a percentile of 30% was found to produce an advantageous result. In other words, in this particular example, the top 30% of remaining unclassified superpixels in ascending rank order of the background distance measure D_B are classified as background seeds and are therefore assigned to the current set of background seeds Ω_B. It is to be appreciated that other example embodiments may adopt a different value for the specified percentile.
The step of classifying remaining unclassified superpixels as background seeds dependent on the rank order of background measure values DB(x) is shown as processing step 522 in Figure 5.
The overall processing step of expanding the set of background seeds by classifying unclassified superpixels as background seeds is depicted as 409 in Figure 4.
It is to be appreciated in embodiments that as a result of performing the superpixel classification steps of Figure 5 there may be residual unclassified superpixels. As a consequence, some further embodiments may repeat the processing steps of Figure 5 in order to further refine the respective sets of foreground and background seeds.
Furthermore the repeat of the superpixel classification steps as laid out in Figure 5 may be performed with a further pair of percentile threshold values, where said percentile threshold values may be more specifically "tuned" to said further repetitions.
It is to be appreciated that the term foreground in the above description may refer to any identifiable object of interest in the image, and may not necessarily refer particularly to an object specifically in the foreground of the image.
It is also to be appreciated that the processing steps as described above and as depicted in Figure 5 may have the technical advantage of reducing the prior input in the form of user interactions when segmenting an object of interest from the rest of a digital image. In particular the above processing steps may exhibit the technical advantage of leveraging user interactions in order to generate new foreground and background seeds based on a sparse set of scribbles indicating the particular object of interest.
The output 308 of the superpixel ranking processor 305 may then comprise the expanded sets of foreground and background seeds, which may in turn form the output 32 of the seed expansion processor 30.
The expanded sets of foreground and background seeds may then go on to be used by the image segmentation processor 102 for subsequent image segmentation tasks.
It shall be appreciated that the term user equipment is intended to cover any suitable type of wireless user equipment, such as mobile telephones, portable data processing devices, portable web browsers, any combination thereof, and/or the like. Furthermore user equipment, universal serial bus (USB) sticks, and modem data cards may comprise apparatus such as the apparatus described in embodiments above.
In general, the various embodiments of the invention may be implemented in hardware or special purpose circuits, software, logic, any combination thereof, and/or the like. For example, some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device, although the invention is not limited thereto. While various aspects of the invention may be illustrated and described as block diagrams, flow charts, or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof, and/or the like.
The embodiments of this invention may be implemented by computer software executable by a data processor of the mobile device, such as in the processor entity, or by hardware, or by a combination of software and hardware. Further in this regard it should be noted that any blocks of the logic flow as in the Figures may represent program steps, or interconnected logic circuits, blocks and functions, or a combination of program steps and logic circuits, blocks and functions. The software may be stored on such physical media as memory chips, or memory blocks implemented within the processor, magnetic media such as hard disks or floppy disks, and optical media such as, for example, DVD and the data variants thereof, and CD.
The memory may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, such as semiconductor-based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory, any combination thereof, and/or the like. The data processors may be of any type suitable to the local technical environment, and may include one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASIC), gate level circuits and processors based on multi-core processor architecture, any combination thereof, and/or the like.
Embodiments of the inventions may be practiced in various components such as integrated circuit modules. The design of integrated circuits is by and large a highly automated process. Complex and powerful software tools are available for converting a logic level design into a semiconductor circuit design ready to be etched and formed on a semiconductor substrate.
Programs, such as those provided by Synopsys, Inc. of Mountain View, California and Cadence Design, of San Jose, California automatically route conductors and locate components on a semiconductor chip using well established rules of design as well as libraries of pre-stored design modules. Once the design for a semiconductor circuit has been completed, the resultant design, in a standardized electronic format (e.g., Opus, GDSII, or the like) may be transmitted to a semiconductor fabrication facility or "fab" for fabrication.
The foregoing description has provided by way of exemplary and non-limiting examples a full and informative description of the exemplary embodiment of this invention. However, various modifications and adaptations may become apparent to those skilled in the relevant arts in view of the foregoing description, when read in conjunction with the accompanying drawings and the appended claims. However, all such and similar modifications of the teachings of this invention will still fall within the scope of this invention as defined in the appended claims.
As used in this application, the term circuitry may refer to all of the following: (a) hardware-only circuit implementations (such as implementations in only analogue and/or digital circuitry); (b) combinations of circuits and software (and/or firmware), such as and where applicable: (i) a combination of processor(s) or (ii) portions of processor(s)/software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions; and (c) circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present.
This definition of circuitry applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term circuitry would also cover an implementation of merely a processor (or multiple processors) or a portion of a processor and its (or their) accompanying software and/or firmware. The term circuitry would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, or other network device.
The terms processor and memory may comprise, but are not limited to, in this application: (1) one or more microprocessors, (2) one or more processor(s) with accompanying digital signal processor(s), (3) one or more processor(s) without accompanying digital signal processor(s), (4) one or more special-purpose computer chips, (5) one or more field-programmable gate arrays (FPGAs), (6) one or more controllers, (7) one or more application-specific integrated circuits (ASICs), or detector(s), processor(s) (including dual-core and multiple-core processors), digital signal processor(s), controller(s), receiver, transmitter, encoder, decoder, memory (and memories), software, firmware, RAM, ROM, display, user interface, display circuitry, user interface circuitry, user interface software, display software, circuit(s), antenna, antenna circuitry, and circuitry.

Claims (32)

CLAIMS:

1. A method comprising: expanding a set of seeds associated with an object in a digital image, wherein the set of seeds comprise a plurality of superpixels of the digital image classified as being associated with the object, wherein the set of seeds is expanded by: determining at least two distance measures, wherein each of the at least two distance measures is a distance between each of at least two superpixels of a plurality of unclassified superpixels of the digital image and the set of seeds; ranking the at least two distance measures in order of the magnitude of their values; and classifying at least one of the at least two superpixels as a superpixel of the set of seeds according to the rank order of the at least two distance measures.
2. The method as claimed in claim 1, wherein ranking the at least two distance measures in order of the magnitude of their values comprises arranging the at least two distance measures in ascending order of magnitude.
3. The method as claimed in claims 1 and 2, wherein classifying the at least one of the at least two superpixels as a superpixel of the set of seeds according to the rank order of the at least two distance measures comprises: determining that at least one of the at least two distance measures are one of a specified number of highest rank order distance measures; and selecting the at least one of the at least two superpixels associated with the determined at least one of the at least two distance measures as a superpixel of the set of seeds.
4. The method as claimed in claims 1 to 3, wherein each of the at least two distance measures comprises a combination of a first distance metric and a second distance metric.
5. The method as claimed in claim 4, wherein the first distance metric represents the spatial distance between a centroid of a superpixel of one of the at least two superpixels of the plurality of unclassified superpixels and the centroid of one of the plurality of superpixels of the set of seeds.
6. The method as claimed in claim 5, wherein the spatial distance is a Euclidean distance.
7. The method as claimed in claims 4 to 6, wherein the second distance metric represents an appearance distance between a feature descriptor of the superpixel of one of the at least two superpixels of the plurality of unclassified superpixels and a feature descriptor of one of the plurality of superpixels of the set of seeds.
8. The method as claimed in claim 7, wherein the superpixel comprises a plurality of pixels of the digital image, and the feature descriptor of a superpixel comprises a populated histogram of a colour channel value for the plurality of pixels of the superpixel.
9. The method as claimed in claim 8, wherein the colour channel value is a pixel colour channel value given by a colour channel from the Commission internationale de l'éclairage L* a* b* colour space.
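For illustration only: one plausible Python/NumPy rendering of the combined distance measure of claims 4 to 9. The equal weighting alpha, the 16-bin histogram, and the choice of the a* channel are assumptions; the claims require only a spatial metric between centroids (claims 5 and 6) combined with an appearance metric between colour-channel histograms (claims 7 to 9).

    import numpy as np

    def superpixel_distance(sp_u, sp_s, alpha=0.5, bins=16):
        # sp_u, sp_s: dicts holding a 'centroid' (x, y) pair and
        # 'lab_pixels', an (N, 3) array of CIE L*a*b* values for the
        # superpixel's pixels; this layout is assumed for the sketch.
        # First metric: Euclidean distance between the two centroids.
        spatial = np.linalg.norm(np.asarray(sp_u['centroid'], dtype=float)
                                 - np.asarray(sp_s['centroid'], dtype=float))
        # Second metric: appearance distance between populated histograms
        # of one colour channel (the a* channel here) over each
        # superpixel's pixels.
        hist_u, _ = np.histogram(sp_u['lab_pixels'][:, 1], bins=bins,
                                 range=(-128.0, 127.0), density=True)
        hist_s, _ = np.histogram(sp_s['lab_pixels'][:, 1], bins=bins,
                                 range=(-128.0, 127.0), density=True)
        appearance = np.linalg.norm(hist_u - hist_s)
        # Claim 4 asks only for a combination of the two metrics; a
        # weighted sum is one straightforward choice.
        return alpha * spatial + (1.0 - alpha) * appearance

Normalising the spatial term (for example, by the image diagonal) would keep the two metrics on comparable scales; the claims leave such details open.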
10. The method as claimed in claims 1 to 9, wherein the object in the digital image is a foreground object, and wherein the set of seeds comprises a plurality of superpixels of the digital image classified as being associated with the foreground object.
11. The method as claimed in claim 10, further comprising repeating the method of claims 1 to 9, wherein the object in the digital image is a background object, wherein the set of seeds comprises a plurality of superpixels of the digital image classified as being associated with the background object, and wherein each of the at least two distance measures is a combination of a distance measure between each of at least two superpixels of a further plurality of unclassified superpixels and the set of background seeds and a distance measure between each of the at least two superpixels of the further plurality of unclassified superpixels and the set of foreground seeds.
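For illustration only: a sketch of the combined measure of claim 11 for growing the background seed set, where closeness to the background seeds counts for a candidate and closeness to the foreground seeds counts against it. Taking the minimum over each seed set and combining the two terms by subtraction are assumptions; the claim fixes neither.

    def background_measure(sp, background_seeds, foreground_seeds, dist):
        # Distance from the candidate superpixel to the set of background
        # seeds, taken here as the smallest distance to any seed.
        d_bg = min(dist(sp, seed) for seed in background_seeds)
        # Distance from the same candidate to the set of foreground seeds.
        d_fg = min(dist(sp, seed) for seed in foreground_seeds)
        # A low combined value marks a strong background candidate, so the
        # ascending ranking of claim 2 still promotes the best candidates.
        return d_bg - d_fg

Here dist could be a pairwise measure such as the superpixel_distance sketched above.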
12. An apparatus comprising at least one processor and at least one memory including computer program code for one or more programs, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to: expand a set of seeds associated with an object in a digital image, wherein the set of seeds comprises a plurality of superpixels of the digital image classified as being associated with the object, wherein the set of seeds is expanded by the apparatus being further caused to: determine at least two distance measures, wherein each of the at least two distance measures is a distance between each of at least two superpixels of a plurality of unclassified superpixels of the digital image and the set of seeds; rank the at least two distance measures in order of the magnitude of their values; and classify at least one of the at least two superpixels as a superpixel of the set of seeds according to the rank order of the at least two distance measures.
13. The apparatus as claimed in claim 12, wherein the apparatus caused to rank the at least two distance measures in order of the magnitude of their values is further caused to arrange the at least two distance measures in ascending order of magnitude.
14. The apparatus as claimed in claims 12 and 13, wherein the apparatus caused to classify the at least one of the at least two superpixels as a superpixel of the set of seeds according to the rank order of the at least two distance measures is further caused to: determine that at least one of the at least two distance measures is one of a specified number of highest rank order distance measures; and select the at least one of the at least two superpixels associated with the determined at least one of the at least two distance measures as a superpixel of the set of seeds.
15. The apparatus as claimed in claims 12 to 14, wherein each of the at least two distance measures comprises a combination of a first distance metric and a second distance metric.
16. The apparatus as claimed in claim 15, wherein the first distance metric represents the spatial distance between a centroid of a superpixel of one of the at least two superpixels of the plurality of unclassified superpixels and the centroid of one of the plurality of superpixels of the set of seeds.
17. The apparatus as claimed in claim 16, wherein the spatial distance is a Euclidean distance.
18. The apparatus as claimed in claims 15 to 17, wherein the second distance metric represents an appearance distance between a feature descriptor of the superpixel of one of the at least two superpixels of the plurality of unclassified superpixels and a feature descriptor of one of the plurality of superpixels of the set of seeds.
19. The apparatus as claimed in claim 18, wherein the superpixel comprises a plurality of pixels of the digital image, and the feature descriptor of a superpixel comprises a populated histogram of a colour channel value for the plurality of pixels of the superpixel.
20. The apparatus as claimed in claim 19, wherein the colour channel value is a pixel colour channel value given by a colour channel from the Commission internationale de l'éclairage L* a* b* colour space.
21. The apparatus as claimed in claims 12 to 20, wherein the object in the digital image is a foreground object, and wherein the set of seeds comprises a plurality of superpixels of the digital image classified as being associated with the foreground object.
22. An apparatus configured to: expand a set of seeds associated with an object in a digital image, wherein the set of seeds comprises a plurality of superpixels of the digital image classified as being associated with the object, wherein the set of seeds is expanded by the apparatus being configured to: determine at least two distance measures, wherein each of the at least two distance measures is a distance between each of at least two superpixels of a plurality of unclassified superpixels of the digital image and the set of seeds; rank the at least two distance measures in order of the magnitude of their values; and classify at least one of the at least two superpixels as a superpixel of the set of seeds according to the rank order of the at least two distance measures.
23. The apparatus as claimed in claim 22, wherein the apparatus configured to rank the at least two distance measures in order of the magnitude of their values is further configured to arrange the at least two distance measures in ascending order of magnitude.
24. The apparatus as claimed in claims 22 and 23, wherein the apparatus configured to classify the at least one of the at least two superpixels as a superpixel of the set of seeds according to the rank order of the at least two distance measures is further configured to: determine that at least one of the at least two distance measures is one of a specified number of highest rank order distance measures; and select the at least one of the at least two superpixels associated with the determined at least one of the at least two distance measures as a superpixel of the set of seeds.
25. The apparatus as claimed in claims 22 to 24, wherein each of the at least two distance measures comprises a combination of a first distance metric and a second distance metric.
26. The apparatus as claimed in claim 25, wherein the first distance metric represents the spatial distance between a centroid of a superpixel of one of the at least two superpixels of the plurality of unclassified superpixels and the centroid of one of the plurality of superpixels of the set of seeds.
27. The apparatus as claimed in claim 26, wherein the spatial distance is a Euclidean distance.
28. The apparatus as claimed in claims 25 to 27, wherein the second distance metric represents an appearance distance between a feature descriptor of the superpixel of one of the at least two superpixels of the plurality of unclassified superpixels and a feature descriptor of one of the plurality of superpixels of the set of seeds.
29. The apparatus as claimed in claim 28, wherein the superpixel comprises a plurality of pixels of the digital image, and the feature descriptor of a superpixel comprises a populated histogram of a colour channel value for the plurality of pixels of the superpixel.
30. The apparatus as claimed in claim 29, wherein the colour channel value is a pixel colour channel value given by a colour channel from the Commission internationale de l'éclairage L* a* b* colour space.
31. The apparatus as claimed in claims 22 to 30, wherein the object in the digital image is a foreground object, and wherein the set of seeds comprises a plurality of superpixels of the digital image classified as being associated with the foreground object.
32. A computer program product comprising at least one computer readable non-transitory medium having program code stored thereon, the program code, when executed by an apparatus, causing the apparatus at least to perform the method according to any of claims 1 to 10.
GB1317979.1A 2013-10-11 2013-10-11 A method and apparatus for image segmentation Withdrawn GB2519130A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1317979.1A GB2519130A (en) 2013-10-11 2013-10-11 A method and apparatus for image segmentation


Publications (2)

Publication Number Publication Date
GB201317979D0 GB201317979D0 (en) 2013-11-27
GB2519130A true GB2519130A (en) 2015-04-15

Family

ID=49679892


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030108237A1 (en) * 2001-12-06 2003-06-12 Nec Usa, Inc. Method of image segmentation for object-based image retrieval
US7295711B1 (en) * 2002-10-23 2007-11-13 Altera Corporation Method and apparatus for merging related image segments
US20080170787A1 (en) * 2007-01-12 2008-07-17 Arcsoft, Inc. Method for image separating
US20100066761A1 (en) * 2006-11-28 2010-03-18 Commissariat A L'energie Atomique Method of designating an object in an image
US20110038523A1 (en) * 2009-08-12 2011-02-17 Genetix Corporation Image segmentation
US7929755B1 (en) * 2005-07-11 2011-04-19 Adobe Systems Incorporated Planar map to process a raster image




Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)