GB2218507A - Digital data processing - Google Patents
- Publication number
- GB2218507A GB2218507A GB8811223A GB8811223A GB2218507A GB 2218507 A GB2218507 A GB 2218507A GB 8811223 A GB8811223 A GB 8811223A GB 8811223 A GB8811223 A GB 8811223A GB 2218507 A GB2218507 A GB 2218507A
- Authority
- GB
- United Kingdom
- Prior art keywords
- image
- edge
- corner
- digital data
- positions
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/443—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/1961—Movement detection not involving frame subtraction, e.g. motion detection on the basis of luminance changes in the image
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/20—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding
Abstract
A method of processing digital data in a digitised image matrix in order to detect the positions of any sharp intensity variations present in adjacent ones of the pixels representing said image, the method comprising the steps of comparing the intensity of each primary pixel with those of the secondary pixels which surround it, calculating whether said primary pixel can be classified as having a constant, an edge-like or a corner-like value, and collating this information to give positions of both edge and corner features which are present in said image matrix. The processing method can be used, for example, in detecting image features, for instance in a burglar alarm system or for the cytological scanning of cell material.
Description
DIGITAL DATA PROCESSING
This invention relates to digital data processing. It relates particularly to digital data information which is present in a stored digitised image matrix and to a method of and means for detecting significant features in the stored information.
When a digital image is to be analysed automatically, it is usually necessary to identify corner and edge features, since these correspond to the outlines of objects and to prominent surface markings which are present in the stored image.
This operation gives a quantity of low-level information which can be used for further work, but it is most important that the data extracted at this early stage should be of a reliable quality.
A description of the edge features present in an image is only helpful for certain parts of the image. For other parts, the so-called image corners, corresponding to edge junctions, to sharp bends in edges and to isolated point-like features, are important. Where the kind of feature to be expected in the image is not restricted in any way, for example in a depiction of a natural scene, describing the image in terms of edges alone, or of corners alone, would fail to capture all of the information present.
Some data processing methods have been proposed which will extract either edge or corner information from an image matrix. One edge extractor is the Canny edge filter. However, this filter can fail to form edge junctions, and an additional heuristic processing operation may be needed to overcome this problem. In addition, in textured regions of the image, edges can form a poor and unreliable descriptive feature, and this may lead one to prefer a corner description.
Some of the best available corner detectors have been proposed by Moravec, Nagel, Beaudet and Kitchen & Rosenfeld. These corner detectors are, however, not ideal since they can be temporally inconsistent in their responses (that is, they are unreliable), and they can respond too readily to the presence of small imperfections in strong image edges.
The present invention was devised in an attempt to overcome some of the disadvantages of both edge and corner feature detectors.
According to the invention, there is provided a method of processing digital data in a digitised image matrix in order to detect the positions of any sharp intensity variations present in adjacent ones of the pixels representing said image, the method comprising the steps of comparing the intensity of each primary pixel with those of the secondary pixels which surround it, calculating whether said primary pixel can be classified as having a constant, an edge-like or a corner-like value, and collating this information to give positions of both edge and corner features which are present in said image matrix.
Conveniently, before said comparison stage, the original image matrix may be processed to form an intermediate simplified image which excludes unnecessary detail. This intermediate image may be a quaternary image.
In one embodiment, the method may include the further step of taking a second digitised image matrix after a given time interval has elapsed, and comparing the processed digital area of the said second image with that of a first image, such that any change in the positions of the selected features of the images will be detected.
By way of example, a particular embodiment of the invention will now be described with reference to the accompanying drawing the single Figure of which shows a block diagram of the main components of an image motion surveillance detector.
The detector device of the invention will first be described with reference to the main components shown in the Figure and the operation of the feature extraction stage will be explained later.
The surveillance detector comprises a video camera 1 which provides information for a digital data processor 2 and this acts to supply output signals on an output line 3. The processor 2 receives input signals from the video camera and these are delivered to an image digitisation stage 4. After this stage, the input information is passed into a frame store 6 and the pixels in this store are then processed in turn by a corner/edge feature extraction circuit 7.
The data obtained from the feature extraction circuit 7 is then applied to a feature matching block 8. In parallel with this, the data is also fed into a delay block 9 which will hold this data for a predetermined time interval before delivering it to the block 8.
At the feature matching block 8, the extracted features of the data stream are compared with the set of features which is compiled after the predetermined time interval. The information obtained is thus able to show any movement of the selected extracted feature which has taken place during the specified time interval. From this information it is possible to calculate the location and image velocity of a moving body which is viewed by the video camera 1.
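As a sketch of how the feature matching block 8 might operate, the following Python fragment pairs each corner found in the current frame with the nearest corner from the delayed frame and converts the displacements into image velocities. The nearest-neighbour pairing rule, the function names and the `max_dist` threshold are illustrative assumptions; the patent does not specify the matching criterion.

```python
# Hypothetical sketch of the feature-matching stage (block 8): corners
# from the current frame are paired with the nearest corner from the
# delayed frame, and each displacement divided by the interval gives an
# image velocity.  All names here are assumptions for illustration.

def match_features(prev_corners, curr_corners, max_dist=5.0):
    """Pair each current corner with the nearest previous corner."""
    matches = []
    for cx, cy in curr_corners:
        best, best_d2 = None, max_dist * max_dist
        for px, py in prev_corners:
            d2 = (cx - px) ** 2 + (cy - py) ** 2
            if d2 <= best_d2:
                best, best_d2 = (px, py), d2
        if best is not None:
            matches.append((best, (cx, cy)))
    return matches

def image_velocities(matches, dt):
    """Displacement of each matched feature divided by the time interval."""
    return [((cx - px) / dt, (cy - py) / dt)
            for (px, py), (cx, cy) in matches]
```

For instance, a corner that moves from (10, 10) to (12, 11) over a one-unit interval yields a velocity of (2.0, 1.0), which is the kind of location-and-velocity output described above.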
In order to detect the presence of a sharp intensity variation in adjacent ones of the pixels representing the image matrix, it is first necessary to compare the intensity of light at one primary pixel with the intensities of each of the secondary pixels which surround it.
There may be at least twenty-four of the secondary pixels, or possibly several hundred. A local auto-correlation function for each of the primary pixels is thus required, and this will describe whether the local patch of image intensities, represented by the possible total of twenty-five pixels, is approximately constant in value, is edge-like, or is corner-like. Corners are indicated by the auto-correlation function being sharply peaked, and edges by the auto-correlation function being ridge-shaped. The explicit auto-correlation function need not be calculated; a determination of its second-order expansion about the origin may be sufficient. A mathematical specification of these requirements will now be given.
Let the (possibly pre-smoothed) image intensities be represented by the array of values Ix,y, where x and y are the Cartesian image coordinates. To start with, the two first gradients are calculated, thus

Xx,y = Ix+1,y - Ix-1,y
Yx,y = Ix,y+1 - Ix,y-1

Next, the smoothed quadratic gradients are calculated
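The first-gradient step above can be sketched in Python as follows, taking the image as a list of rows of intensity values. Skipping the one-pixel border is an assumption made for simplicity; the patent does not specify border handling.

```python
# Sketch of the two first gradients, X and Y, as central differences of
# the image intensities.  Border pixels are left at zero, which is one
# plausible convention and an assumption of this sketch.

def first_gradients(img):
    h, w = len(img), len(img[0])
    X = [[0] * w for _ in range(h)]
    Y = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            X[y][x] = img[y][x + 1] - img[y][x - 1]  # horizontal gradient
            Y[y][x] = img[y + 1][x] - img[y - 1][x]  # vertical gradient
    return X, Y
```

Applied to a vertical step edge, X responds along the step while Y stays zero, as expected of a horizontal intensity gradient.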
A = X² * W
B = Y² * W
C = (X.Y) * W

where * represents convolution, and W is a smoothing filter, an explicit example being

    (1 2 1)
W = (2 4 2)
    (1 2 1)
Finally, the corner/edge response, R, is calculated

R = (A.B - C²) - k(A + B)²

a value of 0.1 being typical for the parameter k.
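The smoothed quadratic gradients A, B and C and the response R can be sketched as below, using the 3x3 filter W given above and k = 0.1. The helper names and the zero-valued borders are assumptions of this sketch, not part of the patent's specification.

```python
# Sketch of the smoothed quadratic gradients and the corner/edge
# response R = (A.B - C^2) - k(A + B)^2, with the smoothing filter W
# given in the description.  Borders are left at zero for simplicity.

W = [[1, 2, 1],
     [2, 4, 2],
     [1, 2, 1]]

def convolve3(img, kern):
    """3x3 convolution (the kernel is symmetric, so no flip is needed)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(kern[j][i] * img[y + j - 1][x + i - 1]
                            for j in range(3) for i in range(3))
    return out

def corner_edge_response(X, Y, k=0.1):
    h, w = len(X), len(X[0])
    A = convolve3([[X[y][x] ** 2 for x in range(w)] for y in range(h)], W)
    B = convolve3([[Y[y][x] ** 2 for x in range(w)] for y in range(h)], W)
    C = convolve3([[X[y][x] * Y[y][x] for x in range(w)] for y in range(h)], W)
    return [[(A[y][x] * B[y][x] - C[y][x] ** 2) - k * (A[y][x] + B[y][x]) ** 2
             for x in range(w)] for y in range(h)]
```

On a patch where only X is non-zero, R comes out negative (edge-like); where X and Y respond at nearby but distinct pixels, R comes out positive (corner-like), matching the classification described below.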
The presence of a corner region is indicated by the value of R being large and positive. An individual pixel is deemed to be a corner if its response, R, is larger than the responses of each of its eight neighbouring pixels. Similarly, an edge region is indicated by R being large and negative. A pixel is deemed to be an edge pixel if its response, R, is smaller (more negative) than the responses of its two neighbours in either the x or the y direction, the direction being chosen according to which of the first gradients, X and Y, is larger in magnitude.
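The corner-selection rule above (a response larger than that of each of the eight neighbours) might be sketched as follows. Requiring the response also to be positive is an added assumption here, reflecting the statement that corner regions have large positive R.

```python
# Sketch of corner-pixel selection: a pixel is marked as a corner when
# its response is positive and strictly exceeds all eight neighbours.
# The positivity check is an assumption of this sketch.

def corner_pixels(R):
    h, w = len(R), len(R[0])
    corners = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            centre = R[y][x]
            neighbours = [R[y + j][x + i]
                          for j in (-1, 0, 1) for i in (-1, 0, 1)
                          if not (i == 0 and j == 0)]
            if centre > 0 and all(centre > n for n in neighbours):
                corners.append((x, y))
    return corners
```

A strict comparison means that a plateau of equal responses yields no corner, which avoids marking every pixel of a uniformly high region.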
Standard edge clean-up algorithms (similar to those used in the Canny edge filter) are applied to the edge image, to remove short lines and spurs, and to complete breaks in edges. The result is a quaternary (that is, a four-state) image, with each pixel classified as a corner, a corner neighbourhood, an edge or background. The edges are thin (that is, one pixel in width) and run between the corner regions. The problem of junction formation is overcome by the presence of corner regions surrounding the corners, at which regions the edges terminate. The problem of edge inconsistency in textured image regions is overcome because such regions are represented by corners rather than by edges.
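Assembling the quaternary image from previously computed corner and edge pixels might look like the following sketch. Treating the "corner neighbourhood" as the eight pixels surrounding each corner is an assumption; the patent does not give an exact extent.

```python
# Sketch of the four-state (quaternary) image: background, edge, corner
# neighbourhood and corner.  The one-pixel neighbourhood radius and the
# state codes are assumptions made for illustration.

BACKGROUND, EDGE, CORNER_NBHD, CORNER = 0, 1, 2, 3

def quaternary_image(h, w, corners, edges):
    img = [[BACKGROUND] * w for _ in range(h)]
    for x, y in edges:                       # mark edge pixels first
        img[y][x] = EDGE
    for cx, cy in corners:                   # neighbourhood overrides edges,
        for j in (-1, 0, 1):                 # so edges terminate at corners
            for i in (-1, 0, 1):
                y, x = cy + j, cx + i
                if 0 <= y < h and 0 <= x < w:
                    img[y][x] = CORNER_NBHD
    for cx, cy in corners:                   # corners override everything
        img[cy][cx] = CORNER
    return img
```

Because the corner neighbourhood overwrites edge labels, an edge running into a corner terminates at the surrounding region, which is how the junction-formation problem is described as being overcome.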
One way of effecting the necessary calculations would be to process the data by means of an off-the-shelf microcomputer, with the algorithm implemented in either high- or low-level software. The input to the microcomputer would be digitised images, and the output would be the locations and image velocities of the required feature combinations. However, owing to the number of calculations involved, use of a microcomputer would be relatively slow, and this slowness would be undesirable for time-critical applications.
For the time-critical applications, special-purpose hardware could be constructed to calculate the intermediate images X, Y, A, B, C and R, and to select appropriate local maxima or minima values of the corner/edge response R as corner or edge pixels respectively.
This proposed special-purpose hardware would be pipelined, and make use of convolution chips and other dedicated VLSI circuits.
This step would be followed by stages for edge clean-up and feature tracking on one or more microcomputers.
Where the scene viewed by the video camera can be expected to change with the passage of time, the invention can be used to process a second digitised image matrix after a given time interval has elapsed. The processed data of the said second image can then be compared with that of a first image so that any change in the positions of the selected features of the images will be detected.
Use of the digital data processing method has been proposed in the construction of passive equipment for surveying a scene, such as for a burglar alarm system. With imagery acquired from a static surveying camera, any consistent motion of edge and corner features will indicate a moving target whilst inconsistent motion may be due to image noise, wind-blown vegetation etc. In a traffic control application, the method can be used to detect the presence of moving vehicles. Since the digitised image could be produced from infra-red radiation rather than visible light, the system could still work effectively at night-time or in bad weather conditions.
Use of the data processing method for the automatic scanning of cytological cell material has also been proposed.
The foregoing description of an embodiment of the invention has been given by way of example only and a number of modifications may be made without departing from the scope of the invention as defined in the appended claims. For instance, although the invention has been described as a method of processing the data in a single image matrix, this would be equally applicable to processing material from two or more image matrices, so that a three-dimensional effect could be obtained.
Claims (5)
1. A method of processing digital data in a digitised image matrix in order to detect the positions of any sharp intensity variations present in adjacent ones of the pixels representing said image, the method comprising the steps of comparing the intensity of each primary pixel with those of the secondary pixels which surround it, calculating whether said primary pixel can be classified as having a constant, an edge-like or a corner-like value, and collating this information to give positions of both edge and corner features which are present in said image matrix.
2. A method as claimed in Claim 1, in which before said comparison stage, the original image matrix is processed to form an intermediate simplified image from which unwanted detail has been excluded.
3. A method as claimed in Claim 2, in which the said intermediate image is a quaternary image.
4. A method as claimed in any one of Claims 1 to 3, including the further step of taking a second digitised image matrix after a given time interval has elapsed, and comparing the processed digital data of the said second image with that of a first image, such that any change in the positions of the selected features of the images will be detected.
5. A method of processing digital data in a digitised image matrix, substantially as hereinbefore described.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/GB1989/000523 WO1990014634A1 (en) | 1988-05-12 | 1989-05-15 | Digital data processing |
EP89906420A EP0449827A1 (en) | 1988-05-12 | 1989-05-15 | Digital data processing |
JP1506161A JPH03502261A (en) | 1988-05-12 | 1989-05-15 | digital data processing |
CA000607884A CA1333424C (en) | 1988-05-12 | 1989-08-09 | Digital data processing |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/GB1989/000523 WO1990014634A1 (en) | 1988-05-12 | 1989-05-15 | Digital data processing |
CA000607884A CA1333424C (en) | 1988-05-12 | 1989-08-09 | Digital data processing |
Publications (3)
Publication Number | Publication Date |
---|---|
GB8811223D0 GB8811223D0 (en) | 1988-08-24 |
GB2218507A true GB2218507A (en) | 1989-11-15 |
GB2218507B GB2218507B (en) | 1992-02-26 |
Family
ID: 4140434
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB8811223A Expired - Lifetime GB2218507B (en) | 1988-05-12 | 1988-05-12 | Digital data processing |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2218507B (en) |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5012082A (en) * | 1989-03-01 | 1991-04-30 | Hamamatsu Photonics K.K. | Two-dimensional incident position detector device for light or radiation |
GB2231740B (en) * | 1989-03-01 | 1993-09-08 | Hamamatsu Photonics Kk | Two-dimensional incident position detector device for light or radiation |
GB2231740A (en) * | 1989-03-01 | 1990-11-21 | Hamamatsu Photonics Kk | Two dimensional incident position detector device for light or radiation |
AU699218B2 (en) * | 1991-09-12 | 1998-11-26 | Electronic Data Systems Corporation | Image analyser |
WO1993005488A1 (en) * | 1991-09-12 | 1993-03-18 | Electronic Data Systems Corporation | Image analyser |
AU662560B2 (en) * | 1991-09-12 | 1995-09-07 | Electronic Data Systems Corporation | Image analyser |
GB2272285A (en) * | 1992-06-10 | 1994-05-11 | Secr Defence | Determining the position of edges and corners in images. |
FR2699781A1 (en) * | 1992-12-21 | 1994-06-24 | Telecommunications Sa | Method for detecting the appearance of point objects in an image |
EP0604245A1 (en) * | 1992-12-21 | 1994-06-29 | SAT (Société Anonyme de Télécommunications) | Method for detecting the appearance of dot objects in an image |
US6002431A (en) * | 1993-03-03 | 1999-12-14 | Goldstar Co., Ltd. | Video correction apparatus for camcorder |
FR2717925A1 (en) * | 1994-03-26 | 1995-09-29 | Jenoptik Technologie Gmbh | A method of identifying defects when inspecting structured surfaces. |
US5767923A (en) * | 1996-06-07 | 1998-06-16 | Electronic Data Systems Corporation | Method and system for detecting cuts in a video signal |
US5778108A (en) * | 1996-06-07 | 1998-07-07 | Electronic Data Systems Corporation | Method and system for detecting transitional markers such as uniform fields in a video signal |
US5734735A (en) * | 1996-06-07 | 1998-03-31 | Electronic Data Systems Corporation | Method and system for detecting the type of production media used to produce a video signal |
US5920360A (en) * | 1996-06-07 | 1999-07-06 | Electronic Data Systems Corporation | Method and system for detecting fade transitions in a video signal |
US5959697A (en) * | 1996-06-07 | 1999-09-28 | Electronic Data Systems Corporation | Method and system for detecting dissolve transitions in a video signal |
US6061471A (en) * | 1996-06-07 | 2000-05-09 | Electronic Data Systems Corporation | Method and system for detecting uniform images in video signal |
WO2003046290A1 (en) | 2001-11-21 | 2003-06-05 | Roke Manor Research Limited | Detection of undesired objects on surfaces |
EP2309456A1 (en) | 2009-09-18 | 2011-04-13 | IMRA Europe S.A.S. | Algorithm for detecting contour points in an image |
WO2011080081A2 (en) | 2009-12-15 | 2011-07-07 | Uws Ventures Ltd. | Image processing |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9196043B2 (en) | Image processing apparatus and method | |
GB2218507A (en) | Digital data processing | |
US6687419B1 (en) | Automatic image montage system | |
US5890808A (en) | Image processing method and apparatus for correlating a test image with a template | |
JP2863818B2 (en) | Moving image change point detection method | |
EP0506327B1 (en) | A system and method for ranking and extracting salient contours for target recognition | |
US7430303B2 (en) | Target detection method and system | |
US6226388B1 (en) | Method and apparatus for object tracking for automatic controls in video devices | |
US7397970B2 (en) | Automatic scene correlation and identification | |
US5341439A (en) | System for texture-based automatic detection of man-made objects in representations of sensed natural environmental scenes | |
US7181047B2 (en) | Methods and apparatus for identifying and localizing an area of relative movement in a scene | |
US5063524A (en) | Method for estimating the motion of at least one target in a sequence of images and device to implement this method | |
EP1105842B1 (en) | Image processing apparatus | |
US20200394802A1 (en) | Real-time object detection method for multiple camera images using frame segmentation and intelligent detection pool | |
US4242734A (en) | Image corner detector using Haar coefficients | |
WO1990014634A1 (en) | Digital data processing | |
KR940003654B1 (en) | Method of processing digital data | |
Sanders-Reed et al. | Multi-target tracking in clutter | |
CA1333424C (en) | Digital data processing | |
Köhn et al. | Automatic Building Extraction and Roof Reconstruction in 3k Imagery Based on Line Segments | |
JPH0546734A (en) | Method for recognizing pattern | |
Kahng et al. | Model-based approach to Landsat Thematic Mapper (TM) scene linear lines of communication (LOC) segment detection using morphology | |
Raimondi et al. | Performance Of Image Enhancement And Target Cueing Techniques On Grafenwoehr II Imagery | |
Merritt | Video tracking of objects on an enhanced PC system | |
Dhellier | Fast target localization method for aerial image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
732 | Registration of transactions, instruments or events in the register (sect. 32/1977) | ||
PE20 | Patent expired after termination of 20 years |
Expiry date: 20080511 |