WO2015079282A1 - Method for determining local differentiating color for image feature detectors - Google Patents
- Publication number
- WO2015079282A1 (PCT/IB2013/003104)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/46—Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
- G06V10/462—Salient features, e.g. scale invariant feature transforms [SIFT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/46—Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
- G06V10/469—Contour-based spatial representations, e.g. vector-coding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
Definitions
- Computer vision utilizes a variety of image feature detectors to identify features of the image or "points of interest" within the image.
- Image feature detectors may identify edges, corners, blobs (i.e., regions of interest points), and/or ridges of an analyzed image, depending on the particular algorithm/detector.
- Canny algorithms and Sobel filters perform edge detection; Harris detectors perform corner detection; and Laplacian of Gaussian (LoG), Hessian of Gaussian determinants, and Difference of Gaussian (DoG) detectors identify corners and blobs within an image.
- Feature detection systems oftentimes utilize a combination of algorithms and detectors to more accurately identify features of an analyzed image.
- Detectors such as the Scale-Invariant Feature Transform (SIFT), Canny, Harris, and Sobel detect and describe features of single-channel images (i.e., grayscale images).
- multi-channel images i.e., colored images
- the image pixel values of the single-channel grayscale image may be generated as a linear combination of corresponding pixel values of each of the channels of the multi-channel image.
- the contrast between multi-channel image pixels having distinct colors but the same single-channel grayscale representation is lost due to the grayscale transformation.
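The linear grayscale combination, and the contrast loss it causes, can be sketched as follows (a NumPy illustration using the common BT.601 luma weights as an assumed linear combination; the text does not prescribe particular weights):

```python
import numpy as np

# Common BT.601 luma weights -- an illustrative assumption; the text only
# requires *some* linear combination of the channels.
WEIGHTS = np.array([0.299, 0.587, 0.114])

def to_grayscale(rgb_pixel):
    """Collapse a multi-channel (RGB) pixel to a single grayscale value."""
    return rgb_pixel @ WEIGHTS

gray_a = np.array([100.0, 100.0, 100.0])   # a neutral gray
# Perturb along a direction orthogonal to the weight vector: the color
# changes, but the weighted sum (the grayscale value) does not.
shift = 100.0 * np.array([0.587, -0.299, 0.0])
color_b = gray_a + shift                   # visibly different color

assert not np.allclose(gray_a, color_b)                          # distinct colors...
assert np.isclose(to_grayscale(gray_a), to_grayscale(color_b))   # ...identical gray
```

Any detector operating only on the grayscale result cannot distinguish the two pixels, which is the information loss the multi-channel method avoids.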
- FIG. 1 is a simplified block diagram of at least one embodiment of a computing device for performing multi-channel feature detection
- FIG. 2 is a simplified block diagram of at least one embodiment of an environment of the computing device of FIG. 1 ;
- FIG. 3 is a simplified flow diagram of at least one embodiment of a method for performing multi-channel feature detection on the computing device of FIG. 1;
- FIG. 4 is a simplified flow diagram of at least one embodiment of a method for determining a local differentiating color vector on the computing device of FIG. 1;
- FIGS. 5 and 6 are diagrams of a captured image and its identified interest points, respectively, based on the method for multi-channel feature detection of FIG. 3 and a SURF feature detector as an inner kernel.
- references in the specification to "one embodiment,” “an embodiment,” “an illustrative embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may or may not necessarily include that particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
- items included in a list in the form of "at least one of A, B, and C" can mean (A); (B); (C); (A and B); (B and C); or (A, B, and C).
- items listed in the form of "at least one of A, B, or C" can mean (A); (B); (C); (A and B); (B and C); or (A, B, and C).
- the disclosed embodiments may be implemented, in some cases, in hardware, firmware, software, or any combination thereof.
- the disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors.
- a machine-readable storage medium may be embodied as any storage device, mechanism, or other physical structure for storing or transmitting information in a form readable by a machine (e.g., a volatile or non-volatile memory, a media disc, or other media device).
- a computing device 100 for multi-channel feature detection is configured to detect features (e.g., interest points such as corners, edges, blobs, etc.) of a multi-channel image. To do so, the computing device 100 utilizes information from multiple image channels in identifying image features rather than a single channel or grayscale image (e.g., a post-transform image).
- the computing device 100 is configured to implement a low-complexity non-iterative algorithm for computing a local differentiating color (LDC) vector in which the response function of the inner kernel can be represented as a linear or quadratic form function.
- LDC local differentiating color
- the second order spatial derivative filter responses Dxx, Dyy, and Dxy may be calculated, where x and y are spatial coordinates of the image.
- the response of a LoG inner kernel may be expressed in a linear form, (Dxx + Dyy).
- the response of a SURF inner kernel may be expressed as a quadratic form, Dxx·Dyy − (0.9·Dxy)².
- the response of an original Harris inner kernel may be expressed as a quadratic form, det(M) − k·(trace M)², where M is the second-moment matrix formed from windowed products of the first order spatial derivative filter responses Dx and Dy, and k is an algorithmic parameter.
- Additionally, a square response of a Canny inner kernel may be expressed as Dx² + Dy², where Dx and Dy are first order spatial derivative filter responses, again where x and y are spatial coordinates of the image.
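Given per-pixel derivative responses, the linear and quadratic forms named above can be written directly. The numbers below are arbitrary illustrative values; the 0.9 weight in the SURF form is the conventional choice assumed here:

```python
# Illustrative per-pixel derivative filter responses (arbitrary numbers).
Dxx, Dyy, Dxy = 2.0, 3.0, 0.5   # second order responses
Dx, Dy = 1.0, -2.0              # first order responses

log_response = Dxx + Dyy                       # LoG inner kernel: linear form
surf_response = Dxx * Dyy - (0.9 * Dxy) ** 2   # SURF inner kernel: quadratic form
canny_sq_response = Dx ** 2 + Dy ** 2          # squared Canny inner kernel

# (The Harris form is not a pure per-pixel expression: it additionally
# requires windowed sums of the derivative products, so it is omitted here.)
```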
- the computing device 100 may be embodied as any type of computing device capable of multi-channel feature detection and performing the functions described herein.
- the computing device 100 may be embodied as a cellular phone, smartphone, tablet computer, netbook, notebook, Ultrabook™, laptop computer, personal digital assistant, mobile Internet device, desktop computer, hybrid device, and/or any other computing/communication device.
- the illustrative computing device 100 includes a processor 110, an input/output ("I/O") subsystem 112, a memory 114, a data storage 116, a communication circuitry 118, and one or more peripheral devices 120.
- the peripheral devices 120 include a camera 122 and a display 124.
- the computing device 100 may include other or additional components, such as those commonly found in a typical computing device (e.g., various input/output devices), in other embodiments. Additionally, in some embodiments, one or more of the illustrative components may be incorporated in, or otherwise form a portion of, another component. For example, the memory 114, or portions thereof, may be incorporated in the processor 110 in some embodiments.
- the processor 110 may be embodied as any type of processor capable of performing the functions described herein.
- the processor may be embodied as a single or multi-core processor(s), digital signal processor, microcontroller, or other processor or processing/controlling circuit.
- the memory 114 may be embodied as any type of volatile or non- volatile memory or data storage capable of performing the functions described herein. In operation, the memory 114 may store various data and software used during operation of the computing device 100 such as operating systems, applications, programs, libraries, and drivers.
- the memory 114 is communicatively coupled to the processor 110 via the I/O subsystem 112, which may be embodied as circuitry and/or components to facilitate input/output operations with the processor 110, the memory 114, and other components of the computing device 100.
- the I/O subsystem 112 may be embodied as, or otherwise include, memory controller hubs, input/output control hubs, firmware devices, communication links (i.e., point-to-point links, bus links, wires, cables, light guides, printed circuit board traces, etc.) and/or other components and subsystems to facilitate the input/output operations.
- the I/O subsystem 112 may form a portion of a system-on-a-chip (SoC) and be incorporated, along with the processor 110, the memory 114, and other components of the computing device 100, on a single integrated circuit chip.
- SoC system-on-a-chip
- the data storage 116 may be embodied as any type of device or devices configured for short-term or long-term storage of data such as, for example, memory devices and circuits, memory cards, hard disk drives, solid-state drives, or other data storage devices.
- the communication circuitry 118 may be embodied as any communication circuit, device, or collection thereof, capable of enabling communications between the computing device 100 and other remote devices over a network (not shown).
- the communication circuitry 118 may use any suitable communication technology (e.g., wireless or wired communications) and associated protocol (e.g., Ethernet, Bluetooth ® , Wi-Fi ® , WiMAX, etc.) to effect such communication depending on, for example, the type of network, which may be embodied as any type of communication network capable of facilitating communication between the computing device 100 and remote devices.
- suitable communication technology e.g., wireless or wired communications
- associated protocol e.g., Ethernet, Bluetooth ® , Wi-Fi ® , WiMAX, etc.
- the peripheral devices 120 of the computing device 100 may include any number of additional peripheral or interface devices. The particular devices included in the peripheral devices 120 may depend on, for example, the type and/or intended use of the computing device 100.
- the peripheral devices 120 include a camera 122 and a display 124.
- the camera 122 may be embodied as any peripheral or integrated device suitable for capturing images, such as a still camera, a video camera, a webcam, or other device capable of capturing video and/or images.
- the camera 122 may be used, for example, to capture multi-channel images in which features are detected.
- the display 124 of the computing device 100 may be embodied as any one or more display screens on which information may be displayed to a viewer of the computing device 100.
- the display 124 may be embodied as, or otherwise use, any suitable display technology including, for example, a liquid crystal display (LCD), a light emitting diode (LED) display, a cathode ray tube (CRT) display, a plasma display, and/or other display technology.
- the display 124 may be used, for example, to display an image indicative of the total response of an analyzed image.
- although shown in FIG. 1 as included in the computing device 100, the camera 122 and/or display 124 may be remote from the computing device 100 but communicatively coupled thereto in other embodiments.
- the computing device 100 establishes an environment 200 for multi-channel feature detection. As discussed below, the computing device 100 determines a total image response of an analyzed multi-channel image based on a local differentiating color (LDC) vector and filter responses of the individual image channels of the multi-channel image.
- the illustrative environment 200 of the computing device 100 includes an image capturing module 202, an image analysis module 204, a display module 206, and a communication module 208. Additionally, the image analysis module 204 includes an image filtering module 210, a local differentiating color module 212, and a response determination module 214.
- Each of the image capturing module 202, the image analysis module 204, the display module 206, the communication module 208, the image filtering module 210, the local differentiating color module 212, and the response determination module 214 may be embodied as hardware, software, firmware, or a combination thereof. Additionally, in some embodiments, one of the illustrative modules may form a portion of another module.
- the image capturing module 202 controls the camera 122 to capture images within the field of view of the camera 122 (e.g., for multi-channel feature detection). Depending on the particular embodiment, the images may be captured as streamed video or as individual images/frames. In other embodiments, the image capturing module 202 may otherwise retrieve a multi-channel image for analysis and feature detection. For example, the multi-channel image may be received from a remote computing device (e.g., in a cloud computing environment) with the communication module 208. It should be appreciated that the captured image may be embodied as any suitable multi-channel image.
- the image may be a three-channel image such as an RGB (red-green-blue), HSL (hue-saturation-lightness), or HSV (hue-saturation-value) image.
- RGB red-green-blue
- HSL hue-saturation-lightness
- HSV hue-saturation-value
- the multi-channel image feature detection described herein may be applied to any type of image channels including channels for non-color spaces (e.g., RGB-D (depth), infrared, temperature map, microwave map, or other image channels).
- the image analysis module 204 retrieves the images captured with the camera 122.
- the image analysis module 204 establishes coordinates and parameters for image extended space points (e.g., for a scale-space representation and/or use with scale-space detectors). Further, as discussed in more detail below, the image analysis module 204 applies various filters to the analyzed image, determines an LDC vector for each image point (or a subset thereof) of the image, and determines the total response of each image point of the image (or a subset thereof).
- the image filtering module 210 determines a filter response (i.e., the result of applying an image filter to the image) of each image channel of the multi-channel image for one or more image filters.
- the image filters may be applied to each pixel of the image in some embodiments.
- the image filters may be applied using, for example, a "windowing" method in which the image filter is applied to a neighborhood (e.g., of the size of the image filter kernel) of the pixel.
- the image filters are generally applied to the individual pixels of an image channel, the image filters may be described herein as being applied to an image channel or other structure as a whole rather than the values of individual pixels for simplicity and clarity of the description.
- the image filtering module 210 applies each image filter to each of the three channels to generate a corresponding filter response based on that filter.
- the filter responses for a particular image channel of the multi-channel image may be represented as a vector including the corresponding responses of the image channel to the one or more image filters. Additionally, such vectors may be referred to as "response vectors" or “vector responses" of the corresponding image channels.
- the particular image filters employed must be linear or quadratic form image filters.
- the LDC vector may be applied to the pixels of the original image channels without any previous filtering or with only trivial/identity filters.
- the local differentiating color module 212 determines a local differentiating color vector based on the filter responses determined by the image filtering module 210. As discussed in detail below, the local differentiating color vector is calculated or determined as a vector that defines weights for a linear combination of filter responses for the image channels and that produces an extreme (i.e., minimum or maximum depending on the particular embodiment) total response. For linear form, the local differentiating color module 212 determines the local differentiating color vector to be a vector that is collinear with a vector of total responses determined for each image channel.
- the local differentiating color vector is determined as an eigenvector (or normalized eigenvector) corresponding to an extreme eigenvalue of a specific generated symmetric matrix (i.e., the largest or smallest eigenvalue depending on the particular embodiment).
- the local differentiating color vector may be expressed in closed form rather than being calculated as a result of an optimization algorithm (e.g., minimizing or maximizing a cost function).
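A minimal sketch of this closed-form selection, assuming the symmetric matrix A has already been built from the channel responses (the 3 × 3 values here are made up for illustration):

```python
import numpy as np

# Illustrative symmetric matrix A; in the method it would be generated from
# the per-channel filter responses of the analyzed image point.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# np.linalg.eigh is specialized for symmetric matrices and returns the
# eigenvalues in ascending order, so the extreme eigenvectors sit at the
# ends -- no iterative optimization is needed.
vals, vecs = np.linalg.eigh(A)
ldc_max = vecs[:, -1]   # eigenvector of the largest eigenvalue
ldc_min = vecs[:, 0]    # eigenvector of the smallest eigenvalue

# eigh already returns unit-norm eigenvectors, so the "normalized LDC
# vector" needs no further scaling.
assert np.isclose(np.linalg.norm(ldc_max), 1.0)
assert np.allclose(A @ ldc_max, vals[-1] * ldc_max)
```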
- the response determination module 214 applies the local differentiating color vector to the image filter responses generated by the image filtering module 210 to generate an adapted response and determines a total response of the multi-channel image based on the adapted response.
- the response determination module 214 applies the local differentiating color vector to the image filter responses by separately calculating the dot product of the local differentiating color vector and the response vector of each image channel of the multi-channel image. Additionally, as discussed in more detail below, the response determination module 214 determines the total response of the multichannel image by generating a scalar value based on the adapted response and parameters of the particular filters and/or feature detection algorithms employed.
- the response determination module 214 also suppresses spatial non-extreme responses of the total response of the multi-channel image. That is, in some embodiments, the response determination module 214 removes non-interest points from the total response, which may be identified based on a pre-defined threshold value. In other words, interest points may be identified as image points having a local extreme response above or below the threshold value depending on the particular embodiment. As such, only interest points remain in the total response.
- the display module 206 is configured to render images on the display 124 for the user of the computing device 100 to view.
- the display module 206 may display one or more captured/received images (see FIG. 5) and/or images indicative of the total response of the images (see FIG. 6).
- the display module 206 may render a visual depiction of the image at another stage of the feature detection process.
- the display module 206 may render a graphical and/or textual depiction of the individual filter responses, the local differentiating color vector, the adapted response, and/or the total response prior to suppression of non-extreme responses.
- the communication module 208 handles the communication between the computing device 100 and remote devices through a network. As discussed above, the communication module 208 may receive multi-channel images from a remote computing device for analysis (e.g., in a cloud computing environment or for offloaded execution). As such, in some embodiments, the communication module 208 may also transmit the result (e.g., the total response) of the feature detection analysis to a remote computing device.
- the communication module 208 may also transmit the result (e.g., the total response) of the feature detection analysis to a remote computing device.
- the computing device 100 may execute a method 300 for performing multi-channel feature detection.
- the illustrative method 300 begins with block 302 of FIG. 3 in which the computing device 100 determines whether to perform multi-channel feature detection. If the computing device 100 determines to perform multi-channel feature detection, the computing device 100 establishes a coordinate system for image extended space points in block 304. In other words, the computing device 100 establishes, for example, a Cartesian coordinate system (e.g., commonly used x- and y- axes) and additional parameters (e.g., scale) for use with scale-space image feature detectors.
- Cartesian coordinate system e.g., commonly used x- and y- axes
- additional parameters e.g., scale
- the computing device 100 determines the filter responses of each image channel based on one or more image filters (e.g., Hessian determinant, Canny, Sobel filter, etc.). In doing so, in block 308, the computing device 100 generates a response vector for each image channel based on the filter responses as discussed above (i.e., by applying the image filters to the individual image channels). For example, suppose the analyzed multichannel image is a three-channel RGB (red-green-blue) image and partial second derivatives of a Gaussian filter (i.e., components of the Hessian matrix) are employed as image filters.
- image filters e.g., Hessian determinant, Canny, Sobel filter, etc.
- the image filters include gxx, gyy, and gxy, which are partial second derivatives with respect to the corresponding image dimensions.
- each of the image filters i.e., each of gxx, gyy, and gxy
- the image filters may be applied to each pixel of the image. Accordingly, a response vector may be generated for each pixel of the image channel.
- each of the image filters is applied to the blue image channel and to the green image channel such that a response vector is generated for each of the channels.
- Each response vector can be reduced to a scalar value.
- the Hessian determinant can be determined by the quadratic form gxx·gyy − gxy².
- gxx, gyy, and gxy are second order partial derivatives of a Gaussian filter taken with respect to the corresponding spatial coordinates x and/or y.
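As an illustrative sketch (one standard discretization, assumed here; the text does not fix a particular one), the per-pixel response vector [gxx, gyy, gxy] of one image channel can be computed with separable Gaussian-derivative filters:

```python
import numpy as np

def gaussian_derivative_kernels(sigma=1.5, radius=4):
    """1-D Gaussian plus its first and second derivatives (an assumed,
    conventional discretization)."""
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x ** 2 / (2 * sigma ** 2))
    g /= g.sum()
    g1 = -x / sigma ** 2 * g                          # d/dx of the Gaussian
    g2 = (x ** 2 / sigma ** 4 - 1 / sigma ** 2) * g   # d²/dx²
    g2 -= g2.mean()                                   # zero response to constants
    return g, g1, g2

def separable_filter(channel, kx, ky):
    """Convolve each row with kx, then each column with ky."""
    out = np.apply_along_axis(lambda r: np.convolve(r, kx, mode="same"), 1, channel)
    return np.apply_along_axis(lambda c: np.convolve(c, ky, mode="same"), 0, out)

def channel_response_vectors(channel):
    """Per-pixel response vectors [gxx, gyy, gxy] for one image channel."""
    g, g1, g2 = gaussian_derivative_kernels()
    gxx = separable_filter(channel, g2, g)
    gyy = separable_filter(channel, g, g2)
    gxy = separable_filter(channel, g1, g1)
    return np.stack([gxx, gyy, gxy], axis=-1)         # shape (H, W, 3)

# A flat (constant) channel yields zero second-derivative responses.
flat = np.full((16, 16), 7.0)
resp = channel_response_vectors(flat)
assert resp.shape == (16, 16, 3)
assert np.allclose(resp[8, 8], 0.0, atol=1e-9)
```

For a multi-channel image, the same filtering would simply be repeated per channel, yielding one such response vector per channel at each pixel.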
- other embodiments may utilize a different number of image filters and/or analyze images having a different number of channels.
- n response vectors (i.e., one for each channel) are generated, each of size/length p (or, more specifically, size p x 1), where n is the number of image channels and p is the number of image filters.
- the computing device 100 determines the local differentiating color vector based on the filter responses of each image channel (e.g., a normalized LDC vector). In other words, the computing device 100 utilizes the response vectors for the image channels to generate the local differentiating color vector. To do so, the computing device 100 may execute a method 400 for determining a local differentiating color vector as shown in FIG. 4. The illustrative method 400 begins with block 402 in which the computing device 100 determines whether to generate the local differentiating color vector. If so, the computing device 100 generates or otherwise determines a symmetric form matrix, A, for the image in block 404.
- A symmetric form matrix
- the computing device 100 may generate a symmetric form of a quadratic form matrix using any suitable techniques, algorithms, and/or mechanisms.
- a_ij represents the element of the A matrix positioned at the i-th row and j-th column
- f_i is a response vector for the image channel corresponding with the index i
- T is a transposition operator (i.e., f^T is the transpose of f)
- B is a predefined matrix based on the one or more image filters.
- the B matrix may be defined as:
- the B matrix may be calculated based on image and/or filter parameters.
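The B matrix itself is not reproduced in this extraction. As a reconstruction (not the text's stated matrix): assuming f = [gxx, gyy, gxy]^T and the plain Hessian-determinant response f^T B f = gxx·gyy − gxy², one symmetric B satisfying this is:

```python
import numpy as np

# Reconstructed symmetric form matrix for the Hessian determinant,
# assuming the filter-response order f = [gxx, gyy, gxy].
B = np.array([[0.0, 0.5, 0.0],
              [0.5, 0.0, 0.0],
              [0.0, 0.0, -1.0]])

rng = np.random.default_rng(0)
f = rng.normal(size=3)            # one channel's [gxx, gyy, gxy] at a pixel
det_h = f[0] * f[1] - f[2] ** 2   # Hessian determinant
assert np.isclose(f @ B @ f, det_h)
```

The off-diagonal 1/2 entries split the gxx·gyy cross term symmetrically, which is what makes B symmetric rather than merely triangular.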
- the computing device 100 calculates, in block 408, the matrix A for a three-channel image (e.g., an RGB image) as the 3 x 3 symmetric matrix with elements a_ij = f_i^T B f_j.
- the analyzed image may include fewer or greater number of channels and, in such embodiments, the matrix A is sized accordingly (e.g., a 4x4 matrix in embodiments in which the analyzed image includes four channels, etc.).
- the matrix A is embodied as an n x n matrix, where n is the number of image channels.
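Assuming the channel response vectors f_i are stacked as rows of a matrix F, the element definition a_ij = f_i^T B f_j lets the whole n × n matrix A be built in one product (a sketch with made-up values; the B matrix used here is the Hessian-determinant reconstruction, an assumption):

```python
import numpy as np

def build_A(F, B):
    """A[i, j] = f_i^T B f_j, for response vectors f_i stacked as rows of F."""
    return F @ B @ F.T

rng = np.random.default_rng(1)
F = rng.normal(size=(3, 3))          # 3 channels x 3 responses [gxx, gyy, gxy]
B = np.array([[0.0, 0.5, 0.0],
              [0.5, 0.0, 0.0],
              [0.0, 0.0, -1.0]])     # assumed Hessian-determinant form matrix
A = build_A(F, B)

assert A.shape == (3, 3)
assert np.allclose(A, A.T)           # symmetric, since B is symmetric

# Sanity check: for any channel weights c, c^T A c equals the quadratic-form
# response of the c-weighted combination of the channels.
c = rng.normal(size=3)
combined = c @ F
assert np.isclose(combined @ B @ combined, c @ A @ c)
```

The sanity check is the reason the eigenvector of A's extreme eigenvalue is the optimal channel weighting: maximizing (or minimizing) c^T A c over unit vectors c is exactly the extreme-eigenvalue problem.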
- the computing device 100 determines the eigenvalues of the A matrix. It should be appreciated that the computing device 100 may utilize any suitable techniques, algorithms, or mechanisms for doing so. For example, the computing device 100 may determine and utilize the characteristic equation of the A matrix in identifying its eigenvalues. In block 412, the computing device 100 identifies the eigenvector corresponding with an extreme eigenvalue of the A matrix (i.e., the largest or smallest eigenvalue depending on the particular embodiment) and, in block 414, the computing device 100 selects the identified eigenvector as the local differentiating color vector. In doing so, the computing device 100 may generate a unit vector for the identified eigenvector in block 416 in some embodiments.
- an extreme eigenvalue of the A matrix i.e., the largest or smallest eigenvalue depending on the particular embodiment
- the computing device 100 selects the identified eigenvector as the local differentiating color vector. In doing so, the computing device 100 may generate a unit vector for the identified
- the computing device 100 may normalize the eigenvector to generate a unit vector corresponding with the eigenvector, which may be selected as the local differentiating color vector.
- the computing device 100 applies the image filter responses to the generated/determined local differentiating color vector to generate a corresponding adapted response in block 312. In doing so, the computing device 100 calculates the dot product of the image filter responses and the local differentiating color vector in block 314 (e.g., to generate a single vector).
- the dot product of the local differentiating color vector and a vector including the partial second derivatives is calculated for all channels of the image, which is equivalent to the transpose of the vector being multiplied by the local differentiating color vector. Specifically, [gxx gyy gxy]^T is multiplied by the local differentiating color vector for all channels of the image.
- the computing device 100 generates a total response based on the adapted response. In the illustrative embodiment, the computing device 100 generates a scalar value based on the adapted response and the particular feature detection algorithms/filters used.
- the computing device 100 may utilize parameters and/or characteristics of the Hessian matrix to generate the total response (e.g., using the Hessian determinant). It should be appreciated that the computing device 100 may utilize any suitable techniques, algorithms, and/or mechanisms for doing so.
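A sketch of the adapted and total response steps, assuming per-channel response vectors [gxx, gyy, gxy] stacked as rows of a matrix F, a normalized LDC vector of channel weights, and a Hessian-determinant total response (all values made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
# Rows of F: one channel's responses [gxx, gyy, gxy] at the analyzed pixel.
F = rng.normal(size=(3, 3))
ldc = rng.normal(size=3)
ldc /= np.linalg.norm(ldc)    # normalized LDC vector (channel weights)

# Adapted response: LDC-weighted combination of the channels, per filter.
adapted = ldc @ F             # -> [gxx, gyy, gxy] of the combined channel

# Total (scalar) response via the Hessian determinant of the adapted values.
gxx, gyy, gxy = adapted
total = gxx * gyy - gxy ** 2
```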
- the computing device 100 suppresses spatial non-extreme responses of the total response in extended space. That is, in block 320, the computing device 100 may remove non-interest points from the total response.
- interest points and non-interest points may be differentiated based on a pre-defined threshold value. For example, in one embodiment, the spatial image points of the total response having a local extreme response or an intensity value exceeding the pre-defined threshold value are considered to be "points of interest" or "interest points,” whereas the spatial image points of the total response having local extreme responses or intensity values not exceeding the predefined threshold value are non-interest points.
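The thresholding and suppression step can be sketched as a strict 8-neighbor local-maximum filter (a simplified 2-D assumption; a full scale-space implementation would also compare responses across neighboring scales):

```python
import numpy as np

def suppress_non_extrema(response, threshold):
    """Keep only pixels exceeding `threshold` that are strict local maxima
    over their 8 spatial neighbors (border pixels are skipped)."""
    h, w = response.shape
    keep = np.zeros_like(response, dtype=bool)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = response[y - 1:y + 2, x - 1:x + 2]
            v = response[y, x]
            # strict maximum: v tops the patch and no neighbor ties it
            if v > threshold and v == patch.max() and (patch == v).sum() == 1:
                keep[y, x] = True
    return keep

r = np.zeros((5, 5))
r[2, 2] = 10.0   # a clear interest point
r[1, 1] = 3.0    # a weak response below the threshold
mask = suppress_non_extrema(r, threshold=5.0)
assert mask[2, 2] and mask.sum() == 1
```

The symmetric case (keeping local minima below a negative threshold) follows by negating the response, matching the "above or below the threshold value depending on the particular embodiment" language above.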
- the computing device 100 may utilize any suitable feature detection algorithm having a quadratic form response function (e.g., SURF) and may identify "points of interest" in any suitable way. As indicated above, depending on the particular algorithm, points of interest may include corners, edges, blobs, and/or other images characteristics. Further, in some embodiments, the generation of the total response in block 316 includes the suppression of spatial non-extreme responses.
- SURF quadratic form response function
- the computing device 100 may generate and display an image indicative of the total response of an analyzed multi-channel image for the user to view.
- a simplified analyzed image 500 is shown in FIG. 5, and a simplified example output image 600, which is illustratively generated based on a multi-channel feature detection (with a SURF inner kernel) of the image 500, is shown in FIG. 6.
- the identified interest points/features are shown as differently shaded circles to connote circles of corresponding different colors.
- the image 600 is a simplified version of a real-world output image that would be generated using the technologies disclosed herein, and such real-world output image may identify points/features of interest using a greater or fewer number of circles having a larger range of different colors and sizes depending on, for example, the original analyzed image. Additionally, it should be appreciated that, unlike single-channel grayscale feature detection, the feature detection performed by the computing device 100 on an analyzed image to generate an output image as described herein does not suffer from the information loss inherent in a grayscale transformation.
- An embodiment of the technologies may include any one or more, and any combination of, the examples described below.
- Example 1 includes a computing device for multi-channel feature detection, the computing device comprising an image filtering module to determine a filter response of each image channel of a multi-channel image for one or more image filters; a local differentiating color module to determine a local differentiating color vector based on the filter responses; and a response determination module to (i) apply the filter responses to the local differentiating color vector to generate an adapted response and (ii) determine a total response of the multi-channel image based on the adapted response.
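The pipeline of Example 1 can be sketched in NumPy as follows. This is an illustrative assumption, not the claimed implementation: the helper names are ours, and the per-pixel unit vector collinear with the channel responses is just one of the local-differentiating-color choices the examples describe (cf. Example 3).

```python
import numpy as np

def filter_response(channel, kernel):
    """Naive 'same'-size 2-D convolution of one image channel (sketch, not optimized)."""
    H, W = channel.shape
    kh, kw = kernel.shape
    p = np.pad(channel, ((kh // 2, kh // 2), (kw // 2, kw // 2)), mode="edge")
    k = kernel[::-1, ::-1]  # flip the kernel for true convolution
    out = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(p[i:i + kh, j:j + kw] * k)
    return out

def total_response(image, kernel):
    """image: H x W x C array. Returns a per-pixel total response."""
    # (i) filter response of each image channel
    R = np.stack([filter_response(image[..., c], kernel)
                  for c in range(image.shape[-1])], axis=-1)
    # (ii) local differentiating color vector: here, the unit vector collinear
    # with the per-pixel vector of channel responses
    norm = np.linalg.norm(R, axis=-1, keepdims=True)
    ldc = R / np.maximum(norm, 1e-12)
    # (iii) adapted response: dot product of the LDC vector and the filter responses
    adapted = np.sum(ldc * R, axis=-1)
    # (iv) total response of the multi-channel image based on the adapted response
    return adapted
```

With a 3×3 Laplacian kernel, for instance, a red square on a black background produces strong channel responses along its boundary, and the adapted response preserves them without a lossy grayscale projection.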
- Example 2 includes the subject matter of Example 1, and wherein the one or more filters consists of an identity filter.
- Example 3 includes the subject matter of any of Examples 1 and 2, and wherein to determine the local differentiating color vector comprises to determine a vector that is collinear with a vector of total responses determined for each image channel based on the filter response of each image channel.
- Example 4 includes the subject matter of any of Examples 1-3, and wherein to determine the local differentiating color vector comprises to determine a symmetric form matrix for the multi-channel image; and identify an eigenvector corresponding with a smallest-valued eigenvalue or largest-valued eigenvalue of the symmetric form matrix.
- Example 5 includes the subject matter of any of Examples 1-4, and wherein to determine the local differentiating color vector comprises to determine a symmetric form of a quadratic form matrix for the multi-channel image; and identify an eigenvector corresponding with a smallest-valued eigenvalue or a largest-valued eigenvalue of the quadratic form matrix.
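Examples 4, 5, and 7 can be sketched together as follows (a minimal NumPy illustration; the function name and the use of a C × C quadratic form matrix `A` over the image channels as input are our assumptions):

```python
import numpy as np

def ldc_from_quadratic_form(A, which="largest"):
    """Local differentiating color vector from a quadratic form matrix A (C x C).

    Sketch only: takes the symmetric form of A (via transposition and averaging),
    picks the eigenvector of its smallest- or largest-valued eigenvalue, and
    normalizes it (cf. Example 7).
    """
    S = 0.5 * (A + A.T)       # symmetric form of the quadratic form matrix
    w, V = np.linalg.eigh(S)  # eigh returns eigenvalues in ascending order
    v = V[:, -1] if which == "largest" else V[:, 0]
    return v / np.linalg.norm(v)
```

Note that `x.T @ A @ x == x.T @ S @ x` for every `x`, so replacing `A` with its symmetric form `S` changes nothing about the quadratic form itself while making the eigen-decomposition well behaved.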
- Example 7 includes the subject matter of any of Examples 1-6, and wherein to determine the local differentiating color vector comprises to normalize the identified eigenvector to generate the local differentiating color vector.
- Example 8 includes the subject matter of any of Examples 1-7, and wherein to determine the filter response of each image channel of the multi-channel image comprises to determine a filter response of each image channel of a multi-channel image for each pixel of the multi-channel image.
- Example 9 includes the subject matter of any of Examples 1-8, and wherein to determine the filter response of each image channel of the multi-channel image comprises to generate a response vector for each image channel based on the filter response of each image channel.
- Example 10 includes the subject matter of any of Examples 1-9, and wherein to apply the filter responses to the local differentiating color vector comprises to calculate a dot product of the local differentiating color vector and the filter responses.
- Example 11 includes the subject matter of any of Examples 1-10, and wherein to determine the local differentiating color vector comprises to determine a normalized local differentiating color vector based on the filter responses.
- Example 12 includes the subject matter of any of Examples 1-11, and wherein the response determination module is further to suppress spatial non-extreme responses of the total response of the multi-channel image.
- Example 13 includes the subject matter of any of Examples 1-12, and wherein to suppress the spatial non-extreme responses comprises to remove non-interest points from the total response of the multi-channel image, wherein the non-interest points are identified based on a pre-defined threshold value.
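A minimal sketch of the suppression described in Examples 12 and 13, assuming a 3×3 neighborhood, strict local maxima, and a pre-defined threshold (the neighborhood size and strictness are our choices):

```python
import numpy as np

def suppress_non_extrema(response, threshold):
    """Keep only pixels that are strict local maxima of the total response in
    their 3x3 neighborhood and meet a pre-defined threshold value."""
    H, W = response.shape
    p = np.pad(response, 1, mode="constant", constant_values=-np.inf)
    # Stack the 8 neighbors of every pixel and take their maximum.
    neigh = np.stack([p[1 + di:1 + di + H, 1 + dj:1 + dj + W]
                      for di in (-1, 0, 1) for dj in (-1, 0, 1)
                      if (di, dj) != (0, 0)])
    # Non-interest points (non-extreme or below threshold) are removed.
    return (response > neigh.max(axis=0)) & (response >= threshold)
```

For example, a pixel whose response exceeds all eight neighbors but falls below the threshold is still discarded, which is the second filtering stage Example 13 describes.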
- Example 14 includes the subject matter of any of Examples 1-13, and further including a display module to display an image indicative of the total response on a display of the computing device.
- Example 15 includes the subject matter of any of Examples 1-14, and further including an image capturing module to capture a captured image with a camera of the computing device, wherein the multi-channel image is the captured image.
- Example 16 includes the subject matter of any of Examples 1-15, and wherein the one or more image filters comprise one or more of a first order derivative image filter or a second order derivative image filter.
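The first- and second-order derivative image filters of Example 16 might look like the following kernels (the particular Sobel-style and Laplacian kernels are illustrative choices, not mandated by the text):

```python
import numpy as np

# First order derivative image filter: Sobel-style horizontal gradient (d/dx).
first_order_x = np.array([[-1, 0, 1],
                          [-2, 0, 2],
                          [-1, 0, 1]])

# Second order derivative image filter: discrete Laplacian (d2/dx2 + d2/dy2).
second_order = np.array([[0,  1, 0],
                         [1, -4, 1],
                         [0,  1, 0]])

# Both kernels sum to zero, so neither responds to a constant image region.
print(first_order_x.sum(), second_order.sum())  # 0 0
```

The zero kernel sum is the design property that matters here: responses concentrate on intensity changes, edges for the first-order kernel and blob-like structures for the second-order one.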
- Example 17 includes a method for performing multi-channel feature detection on a computing device, the method comprising determining, by the computing device, a filter response of each image channel of a multi-channel image for one or more image filters; determining, by the computing device, a local differentiating color vector based on the filter responses; applying, by the computing device, the filter responses to the local differentiating color vector to generate an adapted response; and determining, by the computing device, a total response of the multi-channel image based on the adapted response.
- Example 18 includes the subject matter of Example 17, and wherein the one or more filters consists of an identity filter.
- Example 19 includes the subject matter of any of Examples 17 and 18, and wherein determining the local differentiating color vector comprises determining a vector that is collinear with a vector of total responses determined for each image channel based on the filter response of each image channel.
- Example 20 includes the subject matter of any of Examples 17-19, and wherein determining the local differentiating color vector comprises determining a symmetric form matrix for the multi-channel image; and identifying an eigenvector corresponding with a smallest-valued eigenvalue or largest-valued eigenvalue of the symmetric form matrix.
- Example 21 includes the subject matter of any of Examples 17-20, and wherein determining the local differentiating color vector comprises determining a symmetric form of a quadratic form matrix for the multi-channel image; and identifying an eigenvector corresponding with a smallest-valued eigenvalue or a largest-valued eigenvalue of the quadratic form matrix.
- Example 23 includes the subject matter of any of Examples 17-22, and wherein determining the local differentiating color vector comprises normalizing the identified eigenvector to generate the local differentiating color vector.
- Example 24 includes the subject matter of any of Examples 17-23, and wherein determining the filter response of each image channel of the multi-channel image comprises determining a filter response of each image channel of a multi-channel image for each pixel of the multi-channel image.
- Example 25 includes the subject matter of any of Examples 17-24, and wherein determining the filter response of each image channel of the multi-channel image comprises generating a response vector for each image channel based on the filter response of each image channel.
- Example 26 includes the subject matter of any of Examples 17-25, and wherein applying the filter responses to the local differentiating color vector comprises calculating a dot product of the local differentiating color vector and the filter responses.
- Example 27 includes the subject matter of any of Examples 17-26, and wherein determining the local differentiating color vector comprises determining a normalized local differentiating color vector based on the filter responses.
- Example 28 includes the subject matter of any of Examples 17-27, and further including suppressing, by the computing device, spatial non-extreme responses of the total response of the multi-channel image.
- Example 29 includes the subject matter of any of Examples 17-28, and wherein suppressing the spatial non-extreme responses comprises removing non-interest points from the total response of the multi-channel image, wherein the non-interest points are identified based on a pre-defined threshold value.
- Example 30 includes the subject matter of any of Examples 17-29, and further including displaying, on a display of the computing device, an image indicative of the total response.
- Example 31 includes the subject matter of any of Examples 17-30, and further including capturing, by a camera of the computing device, a captured image, wherein the multi-channel image is the captured image.
- Example 32 includes the subject matter of any of Examples 17-31, and wherein the one or more image filters comprise one or more of a first order derivative image filter or a second order derivative image filter.
- Example 33 includes a computing device comprising a processor; and a memory having stored therein a plurality of instructions that when executed by the processor cause the computing device to perform the method of any of Examples 17-32.
- Example 34 includes one or more machine-readable storage media comprising a plurality of instructions stored thereon that, in response to being executed, result in a computing device performing the method of any of Examples 17-32.
- Example 35 includes a computing device for multi-channel feature detection, the computing device comprising means for determining a filter response of each image channel of a multi-channel image for one or more image filters; means for determining a local differentiating color vector based on the filter responses; means for applying the filter responses to the local differentiating color vector to generate an adapted response; and means for determining a total response of the multi-channel image based on the adapted response.
- Example 36 includes the subject matter of Example 35, and wherein the one or more filters consists of an identity filter.
- Example 37 includes the subject matter of any of Examples 35 and 36, and wherein the means for determining the local differentiating color vector comprises means for determining a vector that is collinear with a vector of total responses determined for each image channel based on the filter response of each image channel.
- Example 38 includes the subject matter of any of Examples 35-37, and wherein the means for determining the local differentiating color vector comprises means for determining a symmetric form matrix for the multi-channel image; and means for identifying an eigenvector corresponding with a smallest-valued eigenvalue or largest-valued eigenvalue of the symmetric form matrix.
- Example 39 includes the subject matter of any of Examples 35-38, and wherein the means for determining the local differentiating color vector comprises means for determining a symmetric form of a quadratic form matrix for the multi-channel image; and means for identifying an eigenvector corresponding with a smallest-valued eigenvalue or a largest-valued eigenvalue of the quadratic form matrix.
- Example 41 includes the subject matter of any of Examples 35-40, and wherein the means for determining the local differentiating color vector comprises means for normalizing the identified eigenvector to generate the local differentiating color vector.
- Example 42 includes the subject matter of any of Examples 35-41, and wherein the means for determining the filter response of each image channel of the multi-channel image comprises means for determining a filter response of each image channel of a multi-channel image for each pixel of the multi-channel image.
- Example 43 includes the subject matter of any of Examples 35-42, and wherein the means for determining the filter response of each image channel of the multi-channel image comprises means for generating a response vector for each image channel based on the filter response of each image channel.
- Example 44 includes the subject matter of any of Examples 35-43, and wherein the means for applying the filter responses to the local differentiating color vector comprises means for calculating a dot product of the local differentiating color vector and the filter responses.
- Example 45 includes the subject matter of any of Examples 35-44, and wherein the means for determining the local differentiating color vector comprises means for determining a normalized local differentiating color vector based on the filter responses.
- Example 46 includes the subject matter of any of Examples 35-45, and further including means for suppressing spatial non-extreme responses of the total response of the multi-channel image.
- Example 47 includes the subject matter of any of Examples 35-46, and wherein the means for suppressing the spatial non-extreme responses comprises means for removing non-interest points from the total response of the multi-channel image, wherein the non-interest points are identified based on a pre-defined threshold value.
- Example 48 includes the subject matter of any of Examples 35-47, and further including means for displaying, on a display of the computing device, an image indicative of the total response.
- Example 49 includes the subject matter of any of Examples 35-48, and further including means for capturing, by a camera of the computing device, a captured image, wherein the multi-channel image is the captured image.
- Example 50 includes the subject matter of any of Examples 35-49, and wherein the one or more image filters comprise one or more of a first order derivative image filter or a second order derivative image filter.
- Example 51 includes a computing device for multi-channel feature detection, the computing device comprising a local differentiating color module to determine a local differentiating color vector based on pixel values of each image channel of a multi-channel image; and a response determination module to (i) apply the pixel values of each image channel to the local differentiating color vector to generate an adapted response and (ii) determine a total response of the multi-channel image based on the adapted response.
- Example 52 includes the subject matter of Example 51, and wherein to determine the local differentiating color vector comprises to determine a vector that is collinear with a vector of total responses determined for each image channel based on the pixel values of each image channel.
- Example 53 includes the subject matter of any of Example 51 and 52, and wherein to determine the local differentiating color vector comprises to determine a symmetric form matrix for the multi-channel image; and identify an eigenvector corresponding with a smallest-valued eigenvalue or largest-valued eigenvalue of the symmetric form matrix.
- Example 54 includes the subject matter of any of Example 51-53, and wherein to determine the local differentiating color vector comprises to normalize the identified eigenvector to generate the local differentiating color vector.
- Example 55 includes the subject matter of any of Example 51-54, and wherein the response determination module is further to suppress spatial non-extreme responses of the total response of the multi-channel image.
- Example 56 includes a method for performing multi-channel feature detection on a computing device, the method comprising determining, by the computing device, a local differentiating color vector based on pixel values of each image channel of a multi-channel image; applying, by the computing device, the pixel values of each image channel to the local differentiating color vector to generate an adapted response; and determining, by the computing device, a total response of the multi-channel image based on the adapted response.
- Example 57 includes the subject matter of Example 56, and wherein determining the local differentiating color vector comprises determining a vector that is collinear with a vector of total responses determined for each image channel based on the pixel values of each image channel.
- Example 58 includes the subject matter of any of Examples 56 and 57, and wherein determining the local differentiating color vector comprises determining a symmetric form matrix for the multi-channel image; and identifying an eigenvector corresponding with a smallest-valued eigenvalue or largest-valued eigenvalue of the symmetric form matrix.
- Example 59 includes the subject matter of any of Examples 56-58, and wherein determining the local differentiating color vector comprises normalizing the identified eigenvector to generate the local differentiating color vector.
- Example 60 includes the subject matter of any of Examples 56-59, and further including suppressing, by the computing device, spatial non-extreme responses of the total response of the multi-channel image.
- Example 61 includes a computing device comprising a processor; and a memory having stored therein a plurality of instructions that when executed by the processor cause the computing device to perform the method of any of Examples 56-60.
- Example 62 includes one or more machine-readable storage media comprising a plurality of instructions stored thereon that, in response to being executed, result in a computing device performing the method of any of Examples 56-60.
- Example 63 includes a computing device for multi-channel feature detection, the computing device comprising means for performing the method of any of Examples 56-60.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2013/003104 WO2015079282A1 (en) | 2013-11-28 | 2013-11-28 | Method for determining local differentiating color for image feature detectors |
CN201380080597.9A CN105683996B (en) | 2013-11-28 | 2013-11-28 | Method for determining the local differential color of image feature detector |
EP13843069.9A EP3074925A1 (en) | 2013-11-28 | 2013-11-28 | Method for determining local differentiating color for image feature detectors |
KR1020167010385A KR101794465B1 (en) | 2013-11-28 | 2013-11-28 | Method for determining local differentiating color for image feature detectors |
JP2016526000A JP2016538630A (en) | 2013-11-28 | 2013-11-28 | A method for determining local discriminant colors for image feature detectors |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015079282A1 true WO2015079282A1 (en) | 2015-06-04 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110363747A (en) * | 2019-06-14 | 2019-10-22 | 平安科技(深圳)有限公司 | Intelligent abnormal cell judgment method, device and computer readable storage medium |
CN111931785A (en) * | 2020-06-19 | 2020-11-13 | 国网山西省电力公司吕梁供电公司 | Edge detection method for infrared image target of power equipment |
Also Published As
Publication number | Publication date |
---|---|
KR101794465B1 (en) | 2017-11-06 |
KR20160060121A (en) | 2016-05-27 |
CN105683996A (en) | 2016-06-15 |
JP2016538630A (en) | 2016-12-08 |
CN105683996B (en) | 2019-10-25 |
EP3074925A1 (en) | 2016-10-05 |