NL2005197C2 - Computer controlled evaluation of binary images. - Google Patents


Info

Publication number
NL2005197C2
Authority
NL
Netherlands
Prior art keywords
image
computer
distance
distance map
evaluated
Prior art date
Application number
NL2005197A
Other languages
Dutch (nl)
Inventor
Theodorus Everardus Schouten
Egidius Leon Broek
Original Assignee
Stichting Katholieke Univ
Priority date
Filing date
Publication date
Application filed by Stichting Katholieke Univ filed Critical Stichting Katholieke Univ
Priority to NL2005197A
Application granted
Publication of NL2005197C2

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54 Surveillance or monitoring of activities of traffic, e.g. cars on the road, trains or boats
    • G06V20/60 Type of objects
    • G06V20/64 Three-dimensional objects

Description

TITLE
Computer controlled evaluation of binary images.
TECHNICAL FIELD
The present invention relates to computer controlled image processing and, in particular, to computer controlled evaluation of two dimensional, 2D, and three dimensional, 3D, binary images, including sequences of images, using a distance map.
BACKGROUND
Digital imaging was developed in the 1960s and 1970s as part of the space program, in particular to advance satellite imaging, and in medical research for medical imaging. Other areas of digital imaging are videophone technology, character recognition, and photo and film recordings.

Rapid advances in digital imaging began with the introduction of microprocessors, alongside progress in related storage and display technologies. The development of Charge Coupled Devices, CCDs, for use in a wide range of image capture devices paved the way for the present large scale commercial use of digital imaging technology. Besides 2D images, more and more 3D images are used.
Two dimensional digital images are composed of a raster or a grid of picture elements, called pixels. Each pixel is a sample of an original image. The picture elements of a three-dimensional digital image are called voxels. A voxel is a sample of a 3D digital image. Each 3D image is composed of a spatial raster or grid of voxels.
A digital image that has only two possible values for each picture element is called a binary image. Typically the colors black and white are used for a binary image, though any two different colors can be used. The color used for the object(s) in the binary image is called the foreground color, while the color of the rest of the image is termed the background color.
Besides traditional photography, 2D and 3D digital imaging technology is applied in a vast number of technical applications, including monitoring, surveillance, face recognition, distance measurements, photogrammetry, computer vision, and many more. For technical use, the images need to be processed and evaluated for object or pattern recognition and pattern matching, in particular with respect to distances between objects captured by a digital image.
Distance Transformation, DT, takes a binary image as input and generates a Distance Image, DI, or distance map, in which each picture element of the binary image is assigned a value that represents the minimum distance of the picture element to an object picture element in the binary image, given a particular distance metric. An object picture element is a picture element of a set of picture elements representing an object in the binary image.
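By way of illustration, a distance transform can be written directly from this definition. The sketch below is a minimal, naive version in plain Python (names are illustrative, not from the patent), computing for every pixel its minimum Euclidean distance to any object pixel:

```python
import math

def distance_map(image):
    """Naive distance transform: assign to every pixel its minimum
    Euclidean distance to any object (foreground, value 1) pixel."""
    objects = [(r, c) for r, row in enumerate(image)
               for c, v in enumerate(row) if v == 1]
    return [[min(math.hypot(r - a, c - b) for a, b in objects)
             for c in range(len(image[0]))]
            for r in range(len(image))]

img = [[0, 0, 0],
       [0, 1, 0],
       [0, 0, 0]]
dm = distance_map(img)
# The object pixel itself gets distance 0; its diagonal
# neighbours get sqrt(2).
```

This brute-force version visits every object pixel for every pixel of the image; the methods discussed below exist precisely to avoid that cost.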
In practice, distance maps may be used to locate an object within a binary image, to calculate distances between objects, and for optical character recognition. Other applications include, for example, medical imaging, robot navigation, and generating structural skeletons of an object. Algorithms for calculating distances between objects based on distance maps are generally known in the art.
Several methods for calculating distance maps are widely available in the field, each using a particular metric for expressing the distance. For example, the "City-block" method measures the so-called L1 length of the distance of a background picture element to a foreground picture element, i.e. an object picture element, calculated in picture element increments. Euclidean methods are based on the so-called L2 length, in which the distance to a picture element at the peripheral of an object is calculated based on the Pythagorean theorem. Other distance transforms include the Chess board, Chamfer, and squared Euclidean metrics, for example.
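The three metrics mentioned above can be sketched as follows (function names are illustrative only; for the point pair (0, 0) and (3, 4) they yield the familiar L1, chess board and L2 values):

```python
import math

def city_block(b, o):
    # L1: distance counted in picture element increments.
    return abs(b[0] - o[0]) + abs(b[1] - o[1])

def chessboard(b, o):
    # L-infinity: number of king moves on a chess board.
    return max(abs(b[0] - o[0]), abs(b[1] - o[1]))

def euclidean(b, o):
    # L2: straight-line distance via the Pythagorean theorem.
    return math.hypot(b[0] - o[0], b[1] - o[1])

b, o = (0, 0), (3, 4)
# city_block -> 7, chessboard -> 4, euclidean -> 5.0
```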
The Fast Exact Euclidean Distance, FEED, method for calculating distance maps is based on the Euclidean Distance, ED. With FEED, each picture element in the set of object picture elements feeds its distance to all picture elements in an image and the minimum of the received distances is calculated. To speed up the distance calculations, FEED does not take into account every picture element in the set of objects, but only those picture elements in the set of objects which are picture elements at the peripheral or border of the object.
Peripheral picture elements or peripheral pixels of an object in a 2D image are construed as those object picture elements or object pixels that have at least one of their eight neighbor or eight connected pixels (i.e. the direct neighbor pixels in the horizontal, vertical and two diagonal directions) in a regular rectangular raster of pixels not belonging to the object itself. The set of peripheral pixels of an object is positioned at and, as such, defines its outer part, also called contour or edge.
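A minimal sketch of this definition, assuming a 0/1 pixel grid in which pixels beyond the raster also count as background:

```python
def contour_pixels(image):
    """Object pixels with at least one of their eight neighbours
    not belonging to the object (2D peripheral pixels)."""
    rows, cols = len(image), len(image[0])
    contour = set()
    for r in range(rows):
        for c in range(cols):
            if image[r][c] != 1:
                continue
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if dr == dc == 0:
                        continue
                    nr, nc = r + dr, c + dc
                    # Outside the raster, or a background pixel.
                    if not (0 <= nr < rows and 0 <= nc < cols) \
                            or image[nr][nc] == 0:
                        contour.add((r, c))
    return contour
```

For a solid 3x3 block, only the eight ring pixels are contour pixels; the centre pixel has all eight neighbours inside the object.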
Peripheral picture elements or peripheral voxels of an object in a 3D image are construed as those object picture elements or object voxels that have at least one of their maximum of twenty-six neighbor or twenty-six connected voxels in a regular cubic raster not belonging to the object itself. An object voxel has three types of neighbors: 6 face, 12 edge, and 8 vertex neighbors. Depending on the complexity of the object in the 3D image, fewer neighbor or connected voxels may be used in the calculation, such as 6, 12 or 18 voxels. The set of peripheral voxels of an object is positioned at and, as such, defines its outer part, also called border or circumference.
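The three neighbour types follow directly from the number of coordinates in which a neighbouring voxel differs; a small sketch:

```python
from itertools import product

# The 26 possible neighbour offsets of a voxel in a cubic raster,
# classified by how many coordinates differ: one coordinate -> face
# neighbour, two -> edge neighbour, three -> vertex neighbour.
offsets = [d for d in product((-1, 0, 1), repeat=3) if d != (0, 0, 0)]
face = [d for d in offsets if sum(map(abs, d)) == 1]
edge = [d for d in offsets if sum(map(abs, d)) == 2]
vertex = [d for d in offsets if sum(map(abs, d)) == 3]
# len(face) == 6, len(edge) == 12, len(vertex) == 8
```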
Compared to other methods, the speed gain in performing the FEED method is considerable.
For calculating distance maps based on the Euclidean distance, global picture information has to be included. For calculating distance maps not based on a Euclidean metric, local picture information suffices. That is, such distance maps can be calculated by a raster scan over the image to propagate distance using local information only, causing these methods to be executed more quickly than the Euclidean based calculations.
In practice there is an increasing demand for faster image evaluation, due to the increased number of picture elements in an image, the increase in speed with which images can be captured and transmitted, the processing of video sequences, and the growing number of applications for which fast image evaluation is required, such as face recognition, robot navigation, video surveillance or other computer vision, identification and authorization applications. These applications often require processing of large sequences of frames with stationary objects and one or more moving or non-stationary objects.

Fast and efficient evaluation of binary images becomes more and more important, in particular for the evaluation of binary images and sequences of binary images, such as video sequences, based on a Euclidean metric.
SUMMARY
It is an object of the present invention to provide an improved method for computer controlled evaluation of binary images, including sequences of images.
It is another object of the present invention to provide an evaluation device operating in accordance with the improved method.
For the purpose of the present description, an object is construed to be a limited set of picture elements, i.e. pixels or voxels, in an area of a binary image. A stationary object is construed to be an object that has exactly the same shape and size and is located at exactly the same location compared to an object in a reference image. A reference image is a binary image that is entirely comprised of stationary objects. A non-stationary object is construed to be an object in an image which differs compared to any object in a reference image, either in shape, size and/or location.
In a first aspect there is provided a computer controlled image evaluation method evaluating a binary image representing stationary and non-stationary objects, the method comprising the steps of:
a) retrieving, by the computer, an image to be evaluated;
b) comparing, by the computer, the image to be evaluated and a binary reference image representing the stationary objects, and identifying stationary and non-stationary objects in the image to be evaluated;
c) determining, by the computer, a maximum value in a distance map of the reference image;
d) identifying, by the computer, for each non-stationary object, a calculation area surrounding that non-stationary object in the image to be evaluated, based on the maximum value in the distance map of the reference image;
e) generating, by the computer, a sub-distance map for each calculation area;
f) generating, by the computer, a distance map of the image to be evaluated based on each sub-distance map and the distance map of the reference image;
g) storing, by the computer, the distance map of the image to be evaluated in an electronic storage unit; and
h) generating an event signal based on the distance map of the image to be evaluated.
With the method disclosed above, a significant improvement is achieved in the speed by which a binary image, and hence a sequence of images such as a video sequence, can be evaluated. This is due to the insight that the distance map of the image to be evaluated, i.e. the image representing both the stationary and non-stationary objects, can be generated from the distance map of the reference image representing the stationary objects and one or more sub-distance maps relating to a calculation area surrounding a respective non-stationary object in the image to be evaluated. That is, in the disclosed method, only part of the image to be evaluated has to be taken into account for calculating the distance map of an image each time a non-stationary object is detected in the image. It will be appreciated that the distance map of the reference image needs to be generated only once.
Mathematically speaking, in a calculation area which surrounds a non-stationary object and which is bounded by the maximum value of the distance map of the reference image, picture elements may have a distance to the non-stationary object less than their value in the distance map of the reference image. All other picture elements of the binary image always have a distance to a stationary object less than or equal to the maximum value, and their distance is not affected by the non-stationary object. Accordingly, the calculation of the distance map of the image to be evaluated can be restricted to the picture elements in such a calculation area.
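This insight can be demonstrated with a small self-contained sketch of steps b) through f). It uses the city-block metric on a toy 6x6 image; all names are illustrative, and a naive full transform is used only to verify that the incremental result is identical:

```python
def city_block_dt(image):
    """Naive full distance map under the city-block (L1) metric."""
    objs = [(r, c) for r, row in enumerate(image)
            for c, v in enumerate(row) if v == 1]
    return [[min(abs(r - a) + abs(c - b) for a, b in objs)
             for c in range(len(image[0]))]
            for r in range(len(image))]

# Reference image: one stationary object pixel at (1, 1).
ref = [[0] * 6 for _ in range(6)]
ref[1][1] = 1
ref_dm = city_block_dt(ref)
max_d = max(max(row) for row in ref_dm)      # step c): maximum value

# Image to be evaluated: the stationary object plus one
# non-stationary object pixel at (4, 4), identified in step b).
img = [row[:] for row in ref]
img[4][4] = 1
new_obj = [(4, 4)]

# Step d): calculation area = bounding box of the non-stationary
# object grown by max_d in every direction, clipped to the raster.
r0 = max(0, min(r for r, _ in new_obj) - max_d)
r1 = min(len(img) - 1, max(r for r, _ in new_obj) + max_d)
c0 = max(0, min(c for _, c in new_obj) - max_d)
c1 = min(len(img[0]) - 1, max(c for _, c in new_obj) + max_d)

# Steps e) and f): sub-distance map inside the area only, merged
# into a copy of the reference distance map.
dm = [row[:] for row in ref_dm]
for r in range(r0, r1 + 1):
    for c in range(c0, c1 + 1):
        d_new = min(abs(r - a) + abs(c - b) for a, b in new_obj)
        dm[r][c] = min(dm[r][c], d_new)

# The incremental result equals a full recomputation.
assert dm == city_block_dt(img)
```

On such a small raster the calculation area happens to cover the whole image; the computational saving appears when the raster is large compared to the maximum value in the reference distance map.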
That is, the distance map of the image to be evaluated is generated in an incremental manner, in that the distance map is built from the distance map of the reference image relating to the stationary object(s) and the increment(s) or sub-distance map(s) relating to the non-stationary object(s). Note that the concept "distance" as used above depends, of course, on the metric defined in the respective Distance Transform method by which the distance map of the reference image and the sub-distance map of each calculation area of the image to be evaluated are generated.
The present incremental method, hereinafter called Incremental Distance Transform, IDT, is applicable with a plurality of Distance Transforms, such as, but not limited to, the FEED method, the City-block distance map method, the Chamfer distance map method, an exact Euclidean distance map method for arbitrary dimensions, EDT, and the Linear-time Legendre Transform, LLT, distance map method, in both 2D and 3D images.
The event signal to be generated may take any suitable form. The signal may be used for executing a particular application, for driving a device, and as a distance measuring signal, for example.
Compared to the prior art, not only is the speed with which a binary image or a sequence of binary images is evaluated significantly improved by the invention, but the amount of electric power involved is reduced as well. That is, the method according to the invention requires less computational power and, accordingly, less electric power. This is a very important advantage when, for example, applying the invention in portable equipment wherein battery power is scarce.
In a further aspect, the representations of the stationary and non-stationary objects in the image to be evaluated are construed to be enclosed by object peripherals and the calculation area is construed to be enclosed by an area peripheral. The calculation area comprises all picture elements between peripheral picture elements of the area peripheral and peripheral picture elements of the corresponding non-stationary object. The peripheral picture elements of the area peripheral have a minimum metric distance to the peripheral picture elements of a corresponding non-stationary object which is equal to or larger than the maximum value in the distance map of the reference image.
Ideally, the calculation area is the area surrounding a non-stationary object in which the shortest distance of each picture element in the calculation area to each picture element of the non-stationary object is smaller than the maximum value in the reference image, given the metric used.
In a further aspect, step f) of the above mentioned computer controlled image evaluation method comprises updating, by the computer, the distance map of the reference image with each sub-distance map.
Generally, the distance map of the reference image is generated beforehand. From the distance map of the reference image and the calculated sub-distance map or maps, the distance map of the image to be evaluated can be produced by simply replacing, by the computer, respective values of pixels or voxels in the distance map of the reference image by the corresponding values of the respective sub-distance map or maps.
In another aspect of the invention, step c) of the above mentioned computer controlled image evaluation method comprises retrieving, by the computer, the maximum value from a distance map of a reference image containing only, and as much as possible of, the stationary objects.
Whether an object is determined to be stationary or non-stationary depends on the reference image. All objects in the reference image are by definition stationary objects for the binary image to be evaluated. All other objects in the binary image to be evaluated are determined to be non-stationary objects.
A distance map of a reference image representing as many stationary objects as possible contributes as much as possible to the efficiency of generating the distance map of the image to be evaluated. If not all stationary objects in an image are identified as such, the non-identified objects will be treated as non-stationary objects.
Instead of evaluating one single binary image, the computer controlled image evaluation method disclosed above may be applied to a sequence of consecutive binary images, such as a video signal comprised of a plurality of consecutive binary images or frames. The term video signal as used in the present application is to be appreciated as encompassing all types of sequences of consecutive images, such as conventional video sequences having 50 or 60 frames per second as well as modern digital video standards, either 2D and/or 3D.
In such a sequence of consecutive binary images, according to a further aspect, an image of the sequence may be selected as the reference image. By selecting a reference image representing as many stationary objects as possible, the greatest computational efficiency will be achieved. Because a non-stationary object in a previous image can be a stationary object for a consecutive image if the object has not moved, for example, this previous image may be selected as the reference image for the consecutive image to be evaluated. The distance map of the thus assigned reference image is, of course, the distance map generated for the previous image.
On the other hand, if a stationary object moves, a new reference image has to be assigned, representing only the stationary objects that remain stationary. The moved stationary object, by definition, has become a non-stationary object for the purpose of the present invention. The distance map of the thus assigned reference image has to be calculated anew, unless a previous distance map relating to the stationary objects that remained stationary is available.
For using this dynamic reference image assignment, in a further aspect, the computer is arranged for comparing images of a sequence of images for detecting whether content of the images has changed. Content that has changed refers to non-stationary objects.
For the detection of objects in a binary image, object recognition techniques known in practice can be applied, ranging from a very coarse object recognition to a more detailed object recognition including detailed shape information.
Initially, the reference image may be generated from an image of a sequence of images, for example, by designating objects in the image to be stationary objects. However, a reference image may also be a pre-stored reference image.
In a video signal, for example, the reference image can be any image comprised in the series of images forming the video signal. In the case that the video signal is generated by, for example, a video surveillance unit which monitors traffic, the reference image may be identified as being an image of the series of images containing no vehicles or other transport means, for example.
The computer controlled image evaluation method disclosed above is particularly efficient with distance map transforms based on a Euclidean distance metric. To reduce computation time, it is advantageous to generate the distance map of the reference image and the sub-distance maps, i.e. the increments of the reference image, on the basis of squares of the Euclidean distance. This effectively avoids extraction of roots, which is a computationally relatively time consuming operation.
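Because squaring is monotone for non-negative distances, comparing squared Euclidean distances yields the same minima as comparing the distances themselves, so the root extraction can be skipped entirely and the calculation stays in integers; a minimal sketch:

```python
def ed_squared(p, q):
    # Integer-only squared Euclidean distance; no square root needed.
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

a, b, c = (0, 0), (3, 4), (6, 0)
# Ordering by squared distance (25 < 36) matches ordering by the
# true distance (5.0 < 6.0), so minima are preserved.
```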
A distance map can be used as a basis for determining distances between objects represented by the respective binary image, for example the distance between a stationary object and a non-stationary object or between two non-stationary objects.
In a further aspect, the event signal is generated when a minimum value of such a distance is below a predetermined minimum value, that is, when a non-stationary object is too close to a stationary object or another non-stationary object.
In another aspect, the event signal is generated when a maximum value of such a distance exceeds a predetermined maximum value, that is, when a non-stationary object is too far away from a stationary object or another non-stationary object.
The person skilled in the art will appreciate that other thresholds may be set for generating the event signal, such as a range of thresholds, each range being assigned a different type of event signal, for example.
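A hedged sketch of such threshold-based event signalling; the function name and the signal values are illustrative, not taken from the patent:

```python
def event_signal(distance, minimum, maximum):
    """Map a distance taken from the distance map to an event signal."""
    if distance < minimum:
        return "TOO_CLOSE"   # non-stationary object too near
    if distance > maximum:
        return "TOO_FAR"     # non-stationary object too far away
    return None              # distance within bounds: no event

# event_signal(2, 5, 20) -> "TOO_CLOSE"
# event_signal(25, 5, 20) -> "TOO_FAR"
```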
The event signal may be generated by the computer or any other suitable device, based on the distance map of the binary image to be evaluated stored in the electronic storage unit. The electronic storage unit including the distance map of the binary image is a concrete and tangible result of the invention, to be used by a plurality of devices and in a plurality of technical applications.
Applications of the invention include, but are not limited to, medical imaging and in particular real-time medical imaging where speed is of utmost importance, microscopic image analysis, X-ray CT scanning, remote satellite sensing, real-time virtual reality in gaming and visualization, video surveillance and (robot) navigation.
In a further aspect, a binary image evaluation device is provided, comprising a binary image input unit, a binary image processing unit, an image content comparison and object recognition unit, an electronic storage unit and an output unit, wherein the image processing unit and the image content comparison and object recognition unit are arranged for executing the computer controlled image evaluation method disclosed above on a binary image supplied to the image input unit, for storing a distance map of the binary image in the electronic storage unit, and for providing an event signal at the output unit.
The event signal may be generated external to the processing unit from the distance map stored in the electronic storage unit.
In an embodiment, the binary image is provided by at least one digital camera connected to the image input unit. The camera can be a general purpose digital camera or a special digital camera such as used in medical techniques, surveillance techniques, traffic monitoring and many more. The binary digital image may be a sequence of binary images provided by the digital camera.
The evaluation device according to the invention may be provided as an integrated electronic circuit, such as a Very Large Scale Integrated circuit, VLSI, an Application Specific Integrated Circuit, ASIC, or the like.
The invention further relates to a computer program product, comprising computer readable code stored on a tangible, computer readable medium, wherein the code, when loaded in a working memory of a computer, is arranged for executing the computer controlled image evaluation method disclosed above.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1a and Fig. 1b show, in a very schematic manner, an example of a two dimensional reference image and a two dimensional binary image to be evaluated in accordance with the present invention.
Fig. 2 shows a flow-chart diagram of an embodiment of the computer controlled image evaluation method according to the invention.
Fig. 3 shows, in a very schematic manner, a two dimensional binary image used in a benchmark.

Fig. 4 shows a very schematic block diagram of an example of an image evaluation device according to the invention.
DETAILED DESCRIPTION OF THE DRAWINGS
Next, the present invention is illustrated in further detail with reference to examples of two dimensional, 2D, image evaluation, referring to pixels as the 2D picture elements. It is however noted that the same applies in a like manner to three dimensional, 3D, image evaluation, having voxels as picture elements. Reference is made to a publication by Schouten, Th.E., Kuppens, H.C., and Broek, E.L. van den (2006), "Three Dimensional Fast Exact Euclidean Distance (3D-FEED) Maps", Proceedings of SPIE (Vision Geometry), 6066, p. 0F-1-0F-10. Those skilled in the art will appreciate that the method according to the invention is easily extended to include more than three dimensions.
Figs. 1a and 1b schematically illustrate an example of two binary images 1 and 10 for use with the present invention. Both images are composed of a raster or grid of pixels (not shown).
Fig. 1a shows a binary image 1 representing a first stationary object 2 having an outer object peripheral or contour 3, and a second stationary object 4 having an outer object peripheral or contour 5. This image serves as a reference image for the purpose of the present invention. For clarification purposes, the shape of the objects 2 and 4 has been arbitrarily chosen. In practice, the objects may have any shape and the number of objects is, of course, not limited to two.
Fig. 1b shows a binary image 10 to be evaluated, representing two stationary objects 2, 4 and one non-stationary object 6. The stationary objects 2, 4 are referred to as stationary because these objects have exactly the same shape and size and are located at exactly the same location compared to the objects 2, 4 of the reference image 1. Object 6 is a non-stationary object, because the position, shape and size of this object differ compared to any object in the reference image 1. Object 6 has an outer object peripheral or contour 7. Likewise, for clarification purposes, the shape of the non-stationary object 6 has been arbitrarily chosen. In practice, this object may have any shape and the number of non-stationary objects is, of course, not limited to one.
The present invention is based on the calculation of distance maps. Given a particular distance metric, the distance map provides for each pixel of a binary image its minimum distance value to an object pixel in the binary image. An object pixel is a pixel of a set of pixels representing an object in the binary image. As mentioned in the introductory part, several computer controlled transformation techniques for calculating or generating a distance map of a binary image are known per se.
Fig. 2 illustrates a flow chart 30 of an embodiment of a computer controlled evaluation method according to the invention.
With reference to Figs. 1a and 1b, for computer controlled evaluation of the image 10, in a first step 20, "Receive image", the image 10 to be evaluated is received by the computer. Note that the image to be evaluated can be a single binary image or an image of a sequence of consecutive binary images, such as a video signal.
In a second step 21, "Detect and compare objects image and reference image", objects represented in the image 10 to be evaluated are detected and compared with a reference image 1. The reference image can be a known image stored beforehand or an image dynamically selected from a sequence of images to be evaluated. For detecting objects in the image to be evaluated, image content and object recognition techniques known in practice can be used. As a result of this step, the stationary objects 2 and 4 and the non-stationary object 6 represented by the image 10 to be evaluated are identified, as illustrated with block 22, "Stationary and non-stationary objects of the image to be evaluated".
In a third step 23, "Retrieve distance map reference image", a distance map of the reference image is retrieved by the computer. If this distance map is not available beforehand, it may be calculated by the computer. The calculation of the distance map, of course, depends on the type of Distance Transform used in a particular application.
In step 24, "Determine maximum value", the maximum value of the distance map of the reference image is determined. In practice, the maximum value of the distance map of the reference image may be available in a look-up table, for example.
Next, in step 25, "Identify calculation area(s)", a calculation area 8 surrounding the non-stationary object 6 is identified with respect to the image 10. The calculation area 8 is bounded by an outer area peripheral or contour 9, shown in dashed lines. The inner peripheral or contour of the calculation area 8 is formed by the outer contour 7 of the non-stationary object 6. Ideally, the shortest distance from a contour pixel of the outer area contour 9 of the calculation area 8 to a contour pixel of the object contour 7 equals the maximum value of the distance map of the reference image 1. In practice, this shortest distance may be slightly larger but not smaller than the maximum value of the distance map of the reference image 1. This, of course, given a particular distance metric.
For the pixels of the binary image 10 in the identified calculation area 8, in a subsequent step 26, "Generate sub-distance map(s)", the computer calculates a distance map, called sub-distance map or increment, in accordance with the distance transform technique used for the calculation of the distance map of the reference image 1.
Next, in step 27, "Generate distance map image", the computer controlled evaluation method generates the distance map of the image 10 by updating the distance map of the reference image 1 with the sub-distance map or sub-distance maps or increments calculated in step 26. That is, instead of calculating the distance map of the image 10 as a whole, the distance map of the image 10 is created from the distance map of the reference image 1 incremented by the sub-distance map of the calculation area 8.
Those skilled in the art will appreciate that the calculation of a sub-distance map for the pixels in a calculation area 8 of each non-stationary object 6 is less time consuming than calculating a distance map for all the pixels of the image 10. Updating the distance map of the reference image 1 can be easily performed by replacing the values of the respective pixels of the calculation area 8 in the distance map of the reference image 1 by the values of the corresponding pixels of the calculated sub-distance map.
The thus formed distance map of the image 10 is stored in an electronic storage unit, step 28, "Store distance map", and provides a tangible product that may be used in a plurality of applications for which the distance of objects has to be evaluated. Depending on set criteria, an event signal may be generated by the computer by which the image 10 is evaluated and/or by any other device, whether or not remote from the electronic storage unit, having access to the stored distance map of the image 10. This is illustrated by step 29, "Generate event signal". The event signal to be generated may take any suitable form. The signal may be used for executing a particular application, for driving a device, and as a distance measuring signal, for example.
When evaluating a sequence of images, the steps 21-29 are repeated for each subsequent image. In such a case, the reference image in step 22 may be selected as an image of the sequence containing only, and as many as possible of, the stationary objects compared to the image to be evaluated.
To clarify the steps of the incremental distance transform method in more detail, reference is made to the mathematical explanation below for a 2D image.
20 As previously explained, a Distance Transformation, DT, converts a binary image into a distance image, Dl, or distance map. In a distance map, to each pixel of the binary image a value is assigned that represents a minimum distance of the pixel to a set O of object pixels o in the binary image, given a particular distance metric.
Such a DT is calculated by the computer as follows:

DI(b) = min_{o ∈ O} D(b, o)    (1)

wherein D can be any metric and b is a pixel, also called a background pixel, of the image which pixel is not in the set of object pixels.
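The definition in equation (1) can be sketched directly, for the squared Euclidean metric, as the following purely illustrative Python fragment (NumPy-based; the function name and structure are our own, not part of the patent disclosure):

```python
import numpy as np

def brute_force_dt(binary):
    """Distance map by direct evaluation of equation (1), using ED^2.

    `binary` is a 2D boolean array in which True marks the object pixels
    (the set O). Object pixels receive distance 0; each background pixel
    b receives the minimum over o in O of ED^2(b, o).
    """
    ys, xs = np.nonzero(binary)              # coordinates of the set O
    di = np.zeros(binary.shape, dtype=np.int64)
    for y in range(binary.shape[0]):
        for x in range(binary.shape[1]):
            if not binary[y, x]:
                # squared Euclidean distance to every object pixel
                d2 = (ys - y) ** 2 + (xs - x) ** 2
                di[y, x] = d2.min()
    return di

# One object pixel at (1, 1) in a 3x3 image:
img = np.zeros((3, 3), dtype=bool)
img[1, 1] = True
brute_force_dt(img)   # [[2, 1, 2], [1, 0, 1], [2, 1, 2]]
```

This direct evaluation costs O(pixels × object pixels) and serves only to fix the notation; the transforms discussed below avoid this cost.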
In an example, the distance for pixels which are in the set O of object pixels is first initialized to zero by the computer, using:

DI(b) = if (b ∈ O) then 0 else ∞    (2)
In other words, if the pixel b under consideration belongs to the set of object pixels representing a particular object, the distance from that pixel to this object is, of course, zero. Otherwise the distance is not yet defined, and in this case is set to infinity. As will be understood, infinity is merely a state in which the distance of the pixel to the set of object pixels is not yet defined. It can therefore be any number or value, as long as that number is considerably large compared to the expected maximum value in the distance map.
Next, the distances of all pixels which are not in the set of object pixels are determined by the computer. The FEED algorithm, for example, calculates a Euclidean distance directly starting from the definition as presented in equation (1) above. An adapted naive FEED algorithm then reads:

for each o ∈ O
    determine: A_o
    update: for each a ∈ A_o do
        DI(a) = min(DI(a), ED²(o, a))    (3)

wherein A_o is the calculation area where o feeds distances to.
In this example the square of the Euclidean Distance, ED, is used to avoid square roots, so that the calculation is performed using integers only. This provides computational efficiency.
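The naive FEED loop of equation (3) can be illustrated as follows; this is an illustrative Python sketch under our own naming, with the calculation area A_o taken as the full image for simplicity:

```python
import numpy as np

def naive_feed(binary):
    """Naive FEED, equation (3): every object pixel o feeds its squared
    Euclidean distance ED^2(o, a) to each pixel a of its calculation
    area A_o (here simply the full image)."""
    h, w = binary.shape
    # Equation (2): 0 for object pixels, "infinity" elsewhere.
    di = np.where(binary, 0, np.iinfo(np.int64).max).astype(np.int64)
    yy, xx = np.mgrid[0:h, 0:w]
    for oy, ox in zip(*np.nonzero(binary)):
        ed2 = (yy - oy) ** 2 + (xx - ox) ** 2   # ED^2(o, a) for all a
        np.minimum(di, ed2, out=di)             # DI(a) = min(DI(a), ED^2(o, a))
    return di
```

Because only squared distances are fed, all arithmetic stays in integers, as noted above.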
As will be appreciated, any metric can be used for calculating the distance of a pixel of a binary image to an object pixel. The efficiency of the method shown in equation (3) can be further improved using the fact that only the peripheral or contour pixels of an object have to be considered, since

min_{o ∈ O} ED(b, o) = min_{o_b} ED(b, o_b)    (4)

wherein o_b is a contour pixel of O, i.e., a pixel of which at least one of its eight connected pixels is not in the set of object pixels. Using the above mentioned method for generating a distance map of an image, the efficiency can be improved even further using stationary and non-stationary objects.
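The contour-pixel condition of equation (4) — an object pixel with at least one of its eight connected neighbours outside the object set — might be sketched as follows (an illustrative NumPy fragment, not part of the patent disclosure):

```python
import numpy as np

def contour_pixels(binary):
    """Contour pixels of O: object pixels with at least one of their
    eight-connected neighbours outside the object set (equation (4))."""
    padded = np.pad(binary, 1, constant_values=False)
    # A pixel is interior when all 8 neighbours are also object pixels.
    interior = np.ones_like(binary)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            interior &= padded[1 + dy: padded.shape[0] - 1 + dy,
                               1 + dx: padded.shape[1] - 1 + dx]
    return binary & ~interior
```

Feeding distances from these pixels only, instead of from all of O, reduces the work of equation (3) without changing the result.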
First, a distance map for an image containing only stationary objects is constructed by the computer as

DI_{s+n}(b) = DI_s(b) = DT with O_s    (5)

wherein DT in this case is the distance transform according to the algorithm defined in equation (3) and O_s is the set of stationary objects. This initializes DI_{s+n}, which is the combined distance map DI for the stationary objects (s) and the non-stationary objects (n). Subsequently, equation (3) is applied again by the computer, with some adaptations, using:

for each o ∈ O_n
    determine: A_o
    update: for each a ∈ A_o do
        DI_{s+n}(a) = min(DI_{s+n}(a), ED²(o, a))    (6)

wherein O_n is the set of non-stationary objects.
The key aspect of the method explained in equation (6) is that the calculation area A_o is restricted to a circle with radius d_max, which is the maximum ED in DI_s. This is because combining O_n and O_s can only decrease the maximum ED, compared to the maximum ED for O_s alone. Note that in a 3D image, the calculation area is restricted to a sphere with radius d_max.
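The restriction of A_o to a radius-d_max window can be sketched as follows (illustrative Python; `idt_feed` and its interface are our own assumptions, and a square window bounding the circle is used for simplicity):

```python
import numpy as np

def idt_feed(di_s, moving_pixels):
    """Incremental update of the stationary distance map DI_s (squared ED).

    Each non-stationary object pixel feeds distances only inside a window
    of radius d_max = sqrt(max(DI_s)): adding objects can only lower
    distances, never raise them above the stationary maximum.
    """
    di = di_s.copy()
    h, w = di.shape
    d_max = int(np.ceil(np.sqrt(di_s.max())))
    for oy, ox in moving_pixels:
        y0, y1 = max(0, oy - d_max), min(h, oy + d_max + 1)
        x0, x1 = max(0, ox - d_max), min(w, ox + d_max + 1)
        yy, xx = np.mgrid[y0:y1, x0:x1]
        ed2 = (yy - oy) ** 2 + (xx - ox) ** 2
        np.minimum(di[y0:y1, x0:x1], ed2, out=di[y0:y1, x0:x1])
    return di
```

The stationary map is computed once; each frame then pays only for the windows around the non-stationary object pixels.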
When using the incremental distance transform in combination with the City-block and Chamfer DT, for example, initialization is performed as follows:

if I[0][0] ∈ O then DI[0][0] = 0 else DI[0][0] = ∞    (7)

wherein I[y][x] is the image to be evaluated and DI[y][x] is its DI.
Next, a forward raster scan is conducted:

if I[y][x] ∈ O then DI[y][x] = 0
else DI[y][x] = min {1 + DI[y][x-1], 1 + DI[y-1][x]}    (8)
Subsequently, a backward raster scan is applied:

if DI[y][x] > 0 then DI[y][x] = min {DI[y][x], 1 + DI[y][x+1], 1 + DI[y+1][x]}    (9)
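Equations (7)-(9) together form the classical two-pass City-block transform, which can be sketched as follows (illustrative Python under our own naming; boundary neighbours outside the image are simply skipped):

```python
import numpy as np

INF = 10 ** 9  # stands in for "infinity" (equation (7))

def city_block_dt(binary):
    """Two-pass City-block (L1) distance transform, equations (7)-(9)."""
    h, w = binary.shape
    di = np.where(binary, 0, INF).astype(np.int64)
    # Forward raster scan (equation (8)).
    for y in range(h):
        for x in range(w):
            if di[y, x] > 0:
                if x > 0:
                    di[y, x] = min(di[y, x], 1 + di[y, x - 1])
                if y > 0:
                    di[y, x] = min(di[y, x], 1 + di[y - 1, x])
    # Backward raster scan (equation (9)).
    for y in range(h - 1, -1, -1):
        for x in range(w - 1, -1, -1):
            if di[y, x] > 0:
                if x < w - 1:
                    di[y, x] = min(di[y, x], 1 + di[y, x + 1])
                if y < h - 1:
                    di[y, x] = min(di[y, x], 1 + di[y + 1, x])
    return di
```

The incremental variant described next runs these same scans only over the area that the non-stationary objects can influence.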
For the IDT based on City-block, first the area is determined over which O_n can change the DI. For this, the bounding box (bb) of O_n is determined and, subsequently, extended in all directions with d_max, i.e. the maximum distance in DI_s.
Then, inside this bb, the maximum occurring distance is determined. Next, the bb of O_n is enlarged with that distance to provide the calculation area over which the IDT implementation of City-block must be applied.
First, the IDT for the City-block algorithm is initialized using equation (5). Second, equations (8) and (9) are applied on O_n instead of O. The forward raster scan starts at the lowest [y][x] point of the bb of O_n, and subsequent lines start with that x. It can be stopped as soon as a scanline [y] with no changes in DI[y][x] has been found. The backward scan is stopped as soon as a scanline below O_n with no changes in DI is detected.
The IDT for Chamfer is similar to the IDT for City-block. However, because the two diagonal neighboring pixels are also taken into account in the raster scans, a larger part of the calculation area has to be covered.
For LLT and EDT, for example, DI_s and DI_n are calculated separately. DI_s is initialized through equation (5) with DT being LLT or EDT, respectively. Next, the full original algorithm is applied on the rectangular calculation area over which the non-stationary object can have an effect, as described above, which results in DI_n. Subsequently, using the min operator, both distance maps are combined to provide the final DI.
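The final combination step can be illustrated with NumPy's element-wise minimum (the concrete numbers below are hypothetical):

```python
import numpy as np

INF = np.iinfo(np.int64).max  # "not yet defined", as in equation (2)

# DI_s: distance map (squared ED) of the stationary objects, computed once.
di_s = np.array([[0, 1, 4],
                 [1, 2, 5],
                 [4, 5, 8]], dtype=np.int64)

# DI_n: distance map of a hypothetical non-stationary object, computed only
# over the rectangular calculation area it can influence (rows/columns 1-2).
di_n = np.full_like(di_s, INF)
di_n[1:3, 1:3] = [[0, 1],
                  [1, 2]]

# The final DI is the element-wise minimum of both maps.
di_final = np.minimum(di_s, di_n)
```

Pixels outside the calculation area keep their stationary distances unchanged, which is what makes the incremental scheme cheap.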
In a benchmark, five 2D distance transform techniques are compared with each other. All five methods are implemented using single precision (32 bit) integers, producing their output in a format fastest for each method: ED² for FEED, EDT and LLT; ED×3 for Chamfer; and ED for City-block.
The times measured for the different methods include the time for reuse of the distances of pixels for subsequent frames. This is achieved by restoring the part changed by the method back from a changed copy of calculated distances. The benchmark is executed on a PC with an Intel Core 2 Duo E6550® 2.33 GHz processor (2 × 32 KB data and 2 × 32 KB instruction L1 cache, 4096 KB L2 cache) and 1024 MB memory, using the gcc compiler. No effort was spent to reduce the execution time, e.g., by varying compiler optimization parameters or using pointers instead of indices.
The benchmark has been performed for a sequence of 120 images of size 652 × 492 pixels. Fig. 3 shows an example of a binary image 32 of this sequence, comprising a plurality of stationary objects 33, having different shapes, sizes and positions, and one non-stationary object, i.e. triangular object 31.
The input image was split into two images: one for the program parts handling O_n, with 0 and 255 indicating respectively non-stationary object pixels and the background; the other for the program parts handling O_s, with 0, 127, and 255 indicating a pixel from respectively O_s, O_n and the background.
The result of the benchmark is that the Incremental Distance Transform, IDT, method provides a significant decrease in the computer calculation time for generating the distance map of the image, and accordingly a considerably faster image evaluation compared to the standard implementation using the same distance transform technique and metric.
For example, with FEED as the transform technique, the incremental implementation according to the invention is at least 5 times faster compared to the prior art FEED. Applying EDT, LLT and Chamfer, the IDT is about 7 times faster compared to the prior art EDT, LLT and Chamfer. The table below presents the times in ns/pixel for providing the distance map for the image shown in Fig. 3.
Table

            Prior art implementation         Incremental implementation*
  FEED   EDT    LLT    Chamfer  City     FEED*  EDT*  LLT*  Chamfer*  City*
  7.1    32.9   33.5   10.3     7.5      0.6    4.7   4.6   1.5       1.0
The left hand part of the table represents the prior art or standard implementation of a particular distance transformation method. The right hand part relates to the IDT according to the invention, marked with an *.
Par excellence, the invention can be used for the evaluation of video sequences, such as but not limited to video surveillance, for image evaluation in robot navigation, data analysis, gaming, skeletonization, and for medical imaging, among others functional Magnetic Resonance Imaging, fMRI, neuromorphometry, EEG, modeling radiation therapy, microscopic image analysis, volume rendering, and pattern recognition.
For FEED, the obtained speed gain is largest. This is mainly due to the fact that to obtain the final result, only the peripheral or contour pixels of the non-stationary object have to feed their EDs in a restricted area. With FEED on full images already being faster than the other two exact EDs, FEED's computer controlled evaluation method according to an implementation of the invention is shown to be at least ten times faster. Other DTs implemented in accordance with the invention provide a speed gain of at least six times. It will be appreciated that the timings depend on the content of the images.
Fig. 4 shows a very schematic block diagram of a typical example of a binary image evaluation device 40 according to the invention.
The binary image evaluation device 40 comprises a processing unit 41 and an input unit 42 having an input 43 for feeding binary images to the processing unit 41. An output unit 44 connects to the processing unit 41 for providing an event signal at an output 45 as a result of an evaluation of a binary image or a sequence of consecutive binary images provided at the input 43, such as provided by a digital camera 47 or a plurality of digital cameras. The processing unit 41 further connects to an electronic storage unit 46 for storing a distance map or distance maps 48 calculated by the processing unit 41 in accordance with the present invention.
As explained in the summary part, a distance map of a reference image representing as many as possible of the stationary objects represented by the image to be evaluated contributes as much as possible to the efficiency in generating the distance map of the image.
The processing unit 41 is arranged for evaluation of binary images as disclosed above, in particular with reference to the flow chart diagram of Fig. 2. The distance map 50 of a reference image may be stored in a memory unit 49, and/or may be selected from the binary images fed to the input 43 of the input unit 42, calculated by the processing unit 41 and stored in the memory unit 49. It will be appreciated that a plurality of distance maps for a plurality of reference images may be stored in the memory unit 49 and/or the storage unit 46.
For determining whether an image or a sequence of images to be evaluated comprises stationary and non-stationary objects, and for using dynamic reference image assignment, an image content comparison and object recognition unit 51 is provided. The processing unit 41 and the content comparison and object recognition unit 51 are arranged to compare an image to be evaluated and a reference image 52, for example stored in the memory unit 49, or a dynamic reference image selected among the images of a sequence of images to be evaluated, for detecting whether content of the images to be evaluated has changed compared to the reference image. Content that has changed refers to non-stationary objects.
For the purpose of the invention, object recognition techniques known in practice can be applied, ranging from a very coarse object recognition to a more detailed object recognition including detailed shape information. An example of a coarse object recognition technique is based on intensity levels of an image to be evaluated. Binary images are created in that, for example, picture elements having an intensity value above a set threshold are considered foreground or object picture elements, and picture elements having an intensity level below or equal to the threshold are considered background pixels. In the case of color images, a filtering based on spectral ranges may be applied for identifying objects represented by an image.
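Such a coarse intensity-based segmentation might look as follows (illustrative Python; the threshold value is an arbitrary example, not prescribed by the patent):

```python
import numpy as np

THRESHOLD = 128  # arbitrary example value; application-specific in practice

def to_binary(gray):
    """Pixels with intensity above the threshold become foreground
    (object) picture elements; the rest become background."""
    return gray > THRESHOLD

gray = np.array([[10, 200],
                 [130, 90]], dtype=np.uint8)
to_binary(gray)   # [[False, True], [True, False]]
```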
It will be appreciated that in some cases the speed of evaluation can be further increased dependent on the shape of an object. A donut shape, for example, may in some cases be considered a single homogeneous and "closed" object.
The event signal may be generated when a distance between objects, i.e. between a stationary object and a non-stationary object or between stationary objects, is below a predetermined minimum value. That is, a non-stationary object is too close to a stationary object or another non-stationary object. To this end, the values of the distance map have to be evaluated following a particular trace, i.e. a trace of picture elements, for example. Algorithms for tracing distances in a distance map are known in practice. In a corresponding manner, an event signal may be generated when a distance exceeds a predetermined maximum value. That is, a non-stationary object is too far away from a stationary object or another non-stationary object. Of course, other thresholds or conditions may be set, for which an event setting unit 53 is provided.
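A minimal sketch of such a threshold check along a trace of picture elements (illustrative Python; the trace, threshold and function name are our own assumptions, not part of the patent disclosure):

```python
import numpy as np

MIN_DISTANCE = 2  # hypothetical minimum allowed value, in the map's metric

def event_needed(distance_map, trace):
    """Evaluate the distance map along a trace of picture elements and
    report whether any of them lies closer to an object than allowed."""
    return any(distance_map[y, x] < MIN_DISTANCE for y, x in trace)
```

A maximum-distance condition follows analogously by testing for values exceeding a set maximum.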
Although the present invention has been illustrated with reference to exemplary embodiments, those skilled in the art will appreciate that the scope of protection is to be determined by the attached claims and is as such not limited to the embodiments described.

Claims (15)

1. A computer-controlled image evaluation method for evaluating a binary image representing stationary and non-stationary objects, the method comprising the steps of:
a) retrieving, by the computer, an image to be evaluated;
b) comparing, by the computer, the image to be evaluated and a binary reference image representing the stationary objects, and identifying stationary and non-stationary objects in the image to be evaluated;
c) determining, by the computer, a maximum value in a distance map of the reference image;
d) identifying, by the computer, for each non-stationary object, a calculation area surrounding each non-stationary object in the image to be evaluated, based on the maximum value in the distance map of the reference image;
e) generating, by the computer, a partial distance map for each calculation area;
f) generating, by the computer, a distance map of the image to be evaluated based on each partial distance map and the distance map of the reference image;
g) storing, by the computer, the distance map of the image to be evaluated in an electronic storage unit; and
h) generating an event signal based on the distance map of the image to be evaluated.

2. The computer-controlled image evaluation method according to claim 1, wherein step f) comprises updating, by the computer, the distance map of the reference image with each partial distance map.

3. The computer-controlled image evaluation method according to any of the preceding claims, wherein step c) comprises retrieving, by the computer, the maximum value from a distance map of a reference image comprising only, and as many as possible of, the stationary objects.

4. The computer-controlled image evaluation method according to any of the preceding claims, wherein the binary image to be evaluated comprises a series of consecutive binary images.

5. The computer-controlled image evaluation method according to claim 4, wherein the computer is arranged to compare images from the series of images in order to select the reference image from among the images of the series.

6. The computer-controlled image evaluation method according to claim 5, wherein an image representing as many as possible of the stationary objects of the image to be evaluated is selected from the series of images as the reference image.

7. The computer-controlled image evaluation method according to any of the preceding claims, wherein the stationary and non-stationary objects are each enclosed by object edges and each calculation area is enclosed by an area edge, each calculation area comprising all picture elements of the binary image between edge picture elements of each area edge and edge picture elements of a corresponding non-stationary object, wherein the edge picture elements of each area edge have a minimum metric distance to the edge picture elements of the corresponding non-stationary object, which distance is equal to or greater than the maximum value in the distance map of the reference image.

8. The computer-controlled image evaluation method according to any of the preceding claims, wherein the distance map of the image to be evaluated and of the reference image and the partial distance map comprise a Euclidean distance metric.

9. The computer-controlled image evaluation method according to claim 8, wherein the distance map of the image to be evaluated and of the reference image and the partial distance map comprise squares of the Euclidean distance.

10. The computer-controlled image evaluation method according to any of the preceding claims, wherein distances between objects are calculated from the distance map of the image to be evaluated, and wherein the event signal is generated when such a distance is below a predetermined minimum value.

11. The computer-controlled image evaluation method according to any of the preceding claims, wherein distances between objects are calculated from the distance map of the image to be evaluated, and wherein the event signal is generated when such a distance exceeds a predetermined maximum value.

12. A binary image evaluation device, comprising a binary image input unit, a binary image processing unit, an electronic storage unit, an image content comparison and object recognition unit, and an output unit, wherein the image processing unit and the image content comparison and object recognition unit are arranged for performing the method according to any of the preceding claims on a binary image provided at the image input, for storing a distance map of the binary image in the electronic storage unit, and for providing an event signal to the output unit.

13. The evaluation device according to claim 12, wherein at least one digital camera is connected to the image input unit.

14. The evaluation device according to claim 12, wherein a plurality of the units are formed in an integrated electronic circuit, including a Very Large Scale Integrated circuit, VLSI, and an Application Specific Integrated electronic Circuit, ASIC.

15. A computer program product comprising computer-readable code stored on a tangible, computer-readable medium, wherein the code, when loaded into a working memory of a computer, is arranged to perform the computer-controlled image evaluation method according to any of claims 1-11.
NL2005197A 2010-08-06 2010-08-06 Computer controlled evaluation of binary images. NL2005197C2 (en)


Publications (1)

Publication Number Publication Date
NL2005197C2 true NL2005197C2 (en) 2012-02-07

Family

ID=44041748


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
SCHOUTEN, TH.E. ET AL.: "Video Surveillance using distance maps", SPIE REAL-TIME IMAGE PROCESSING, vol. 6063, 16 January 2006 (2006-01-16), San Jose, pages 54 - 63, XP002639764, ISBN: 0819461032, Retrieved from the Internet <URL:http://repository.ubn.ru.nl/handle/2066/35054> [retrieved on 20110531] *
T. E. SCHOUTEN AND E. L. VAN DEN BROEK: "Incremental Distance Transforms (IDT)", 2010 20TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR),, 23 August 2010 (2010-08-23), pages 237 - 240, XP002639763, ISSN: 1051-4651, ISBN: 978-1-4244-7542-1, [retrieved on 20110531] *
T. SCHOUTEN, E. VAN DEN BROEK: "Fast Exact Euclidean Distance (FEED) Transformation", ICPR 2004. PROCEEDINGS OF THE 17TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, 2004., vol. 3, 23 August 2004 (2004-08-23), pages 594 - 97, XP002639765, ISSN: 1051-4651, ISBN: 0-7695-2128-2, Retrieved from the Internet <URL:http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1334599&tag=1> [retrieved on 20100531] *
TH. E SCHOUTEN, H. KUPPENS, AND E. L. VAN DEN BROEK: "Timed fast exact euclidean distance (tfeed) maps", PROCEEDINGS OF REAL-TIME IMAGING IX, vol. 5671, 18 January 2005 (2005-01-18), San Jose California, pages 52 - 63, XP002639762, Retrieved from the Internet <URL:http://spie.org/x648.html?product_id=587784> [retrieved on 20110531] *


Legal Events

Date Code Title Description
V1 Lapsed because of non-payment of the annual fee

Effective date: 20140301