GB2208194A - Tomographic imaging - Google Patents

Tomographic imaging

Info

Publication number
GB2208194A
GB2208194A (application GB8710495A)
Authority
GB
United Kingdom
Prior art keywords
images
image
signals
scene
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB8710495A
Other versions
GB8710495D0 (en)
Inventor
Martin Adrian Newman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rolls Royce PLC
Original Assignee
Rolls Royce PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rolls Royce PLC filed Critical Rolls Royce PLC
Priority to GB8710495A priority Critical patent/GB2208194A/en
Publication of GB8710495D0 publication Critical patent/GB8710495D0/en
Publication of GB2208194A publication Critical patent/GB2208194A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10116X-ray image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

Tomographic images of an object or scene (A-F) are produced by an analysis of two or more stereographic images of the scene including shifting one image laterally with respect to another and logically summing the image data sets. Several image processing, edge enhancement and edge extraction algorithms may be applied to the images in digitised video data form to provide wire-frame or skeleton type representations of each of the original images. Tomographic images of planes not parallel with the image plane (or normal to the camera axes) may be produced by changing the magnification of one image prior to logical summing (Fig 4). The images may be generated by three video cameras arranged on two orthogonal axes (Fig 6) for elimination of spurious coincidences. The images are preferably produced using X-rays.

Description

IMAGE TOMOGRAPHY

The invention relates to a method of, and apparatus for, image tomography for producing a sectioned view of a three dimensional object or scene.
Non-intrusive techniques are used, for example, in the aero engine industry to examine complete engine assemblies and also to observe in detail the behaviour under running conditions of internal engine components. Prior to about 1970 these techniques were not available, but since then x-rays, and more recently other forms of penetrating radiation, have been used.
Although the techniques have provided valuable insight into the dynamic behaviour of engine components and assemblies, they possess inherent drawbacks in that a thorough understanding of running machinery sometimes requires a three-dimensional section, preferably without recourse to expensive but comprehensive full tomographic methods. The present invention seeks to provide such sections quickly and readily from a minimum number of stereoscopic views. Because the operator cannot reconstruct images with stereoscopic separation greater than eyeball separation, the accuracy of depth measurement by conventional visual comparison methods, that is distance from the camera, is limited. Also, since stereo perception is a brain function and not a viewing process, the ability is developed to different extents in different people, and in some not at all, which excludes this group from perceiving stereo images.
The invention is intended to solve these difficulties by providing a method of forming a tomographic image without involving a human operator. Although the invention can be used in conjunction with ordinary visible light images, it works best with images formed by penetrating radiation, such as x-rays, which contain edge detail which is otherwise hidden from view. It is also an object of the invention to provide apparatus for carrying out the method. The method is based upon analysis of the edges of the images of objects in the field of view and in particular upon the relative spacing of the edges of images in several picture frames. The calculations involved are simple enough for high speed implementation by current processors thereby making possible real time processing.
According to the invention there is provided a method of image tomography for producing a sectioned view of a three dimensional object or scene comprising the steps of forming at least two stereographic images of the object or scene in which the image planes are coplanar, and identifying coplanar features in the object or scene as viewed by comparing the ratios of the separation distances between the said features in the plurality of images.
Preferably the stereographic images are formed using at least one video camera, and the method further includes steps of digitising the output signal of the or each video camera and processing the video images by operating on the digitised output signals. Other image forming means, such as charge coupled device imagers and line scan imagers may also be employed.
Apparatus suitable for carrying out the above method comprises at least one image forming means arranged to form a plurality of images of an object or scene along a like plurality of sight lines, and means for comparing the separation distances of pairs of features in the images in order to determine which features of the images are coplanar.
Preferably the or each image forming means comprises a video camera in which an image formed in the camera is scanned raster fashion by illumination sensitive means which produces an output signal representative of the level of illumination in the image, and the means for comparing the separation distances is operative to effect a lateral shift of one of each pair of images and to compare subsequently coincident features in the signals.
The invention will now be described in greater detail with reference, by way of example only, to the accompanying drawings, in which:
Fig 1 shows in plan view the geometry of stereo image production for five coplanar and one non-coplanar objects to illustrate the principle of the method,
Fig 2 shows a corresponding diagram for six objects grouped in three pairs on three planes,
Fig 3 illustrates the method of comparing image separation distances to identify coplanar objects,
Fig 4 illustrates the method applied to produce sections on planes other than orthogonal to the line of sight,
Fig 5 shows a schematic arrangement using two x-ray sources,
Fig 6 illustrates apparatus for carrying out the invention using three video cameras,
Fig 7 shows diagrams illustrating the process of image edge extraction, and
Fig 8 illustrates the stereo resolution achievable using video cameras.
Referring now to the drawings, Fig 1 shows a plan view of the geometry involved in the formation of two stereo images of six objects A, B, C, D, E and F, five of which lie in the same plane perpendicular to the line of sight of cameras 1 and 2, with the sixth object, F, occupying a separate plane spaced apart from the first plane containing the five objects A-E. The focal planes of the cameras 1 and 2 are coplanar. The images formed of the objects have references A1-F1 in camera 1 and A2-F2 in camera 2. Light rays forming the images are shown as straight lines intersecting at a point on the optical axis of each camera at the position of its aperture stop.
Consideration of the optical geometry of Fig 1 shows that the ratios of the separation distances of the images of the objects A-E are preserved in the image planes of both cameras. That is, the ratio A1B1/B1C1 is equal to the ratio A2B2/B2C2, and so on for the images of each pair of objects occupying the first object plane.
However, it will be observed that for the object F in the second object plane, although its image in both cameras lies between the images of objects C and D, in camera 1 the image F1 lies closer to C1 than it does to D1, whereas in camera 2 the image F2 lies closer to D2 than it does to C2. Thus, the constant proportionality of the ratios of separation distances does not hold for images in different object planes.
The present invention seeks to exploit this phenomenon in order to pick out the images of those objects which are coplanar.
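The coplanarity test just described can be sketched in a few lines of code. The following is a minimal illustration, not the patented implementation; the point coordinates and the tolerance value are hypothetical, chosen only to show the ratio comparison at work:

```python
def separation_ratios(xs):
    """Ratios of successive separation distances between image points
    along a scan line."""
    gaps = [b - a for a, b in zip(xs, xs[1:])]
    return [g1 / g2 for g1, g2 in zip(gaps, gaps[1:])]

def coplanar(view1, view2, tol=1e-6):
    """Features lie in one object plane if the ratios of their
    separation distances agree in both stereo views."""
    return all(abs(r1 - r2) < tol
               for r1, r2 in zip(separation_ratios(view1),
                                 separation_ratios(view2)))

# Four points in one object plane: the separation ratios agree
# in both views, despite the overall parallax shift.
print(coplanar([0.1, 0.2, 0.4, 0.7], [-0.1, 0.0, 0.2, 0.5]))   # True
# Third point moved to a different plane: the ratios no longer agree.
print(coplanar([0.1, 0.2, 0.5, 0.7], [-0.1, 0.0, 0.25, 0.5]))  # False
```

Note that the overall lateral offset between the two views cancels out of the ratios, which is exactly why the test is insensitive to parallax for coplanar points.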
Referring now to Fig 2, this shows a plan view, corresponding to the view of Fig 1, for six objects grouped in pairs on three different object planes.
It will be seen that, whereas the separation between the images of any pair of objects lying in the same plane is the same in both cameras, the relative spacing of the images of objects in different planes differs between the two cameras.
The principle of the correlation technique for picking out the images of objects in the same object plane is illustrated in Fig 3. At (a) the images I1 and I2 from cameras 1 and 2 respectively are shown one above the other. As a result of parallax the images are not the same and in particular the relative spacings of the images of the objects A-F are not the same in both images.
Thus, the result of a straightforward comparison of the two images, by applying a logical "AND" function, is the blank composite image shown immediately below the pair of images I1 and I2 in Fig 3 (a). In practice, because the images contain a large amount of detail, some spurious correlation is inevitable and anomalous points occur in the composite image. This is largely eliminated, according to a further aspect of the invention, by forming a third image which constitutes a second stereo pair with one of the other two images, but in which the parallax error is due to displacement in a direction perpendicular to the displacement direction of the first pair of images. This arrangement will be described in greater detail later with reference to Fig 7.
In the second comparison stage, illustrated by Fig 3 (b), I1, the upper of the two images, is shifted horizontally to the right (in the drawing) until coincidence is found between the images of points E and C in the two images I1 and I2. This step effectively eliminates the parallax error in the two images for those two points. Then the logical "AND" function is again applied to the images to yield a composite image comprising only those points which in the drawing are found one above the other. This time the composite image contains, basically, only images of the points E and C, all other image detail having been suppressed. Thus the effect of the relative lateral shift of the images I1 and I2 to find a coincidence of image points has been effectively to slice the image in a plane passing through the selected points. That is, the composite or resulting image constitutes a tomographic section of the original object or scene.
A still further and third comparison stage is illustrated at Fig 3 (c) in which the upper image I1 has been shifted further in order to eliminate the parallax error between the images of points F and A this time. The result of the logical "AND" function now yields a composite image which represents a tomographic section through the original object or scene containing the points F and A.
Alternatively, if the scale of the images is known and a ground datum or reference established in each then tomographic sections can be formed in selected planes by shifting the images relatively by predetermined amounts.
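The shift-and-AND slicing of Fig 3 can be sketched on a single binary scan line. This is a minimal illustration only; the feature positions and shift values below are hypothetical, chosen so that two features share one depth plane (parallax of two pixels) and a third lies in another (parallax of four pixels):

```python
def and_with_shift(line1, line2, shift):
    """Shift line1 laterally by `shift` pixels, then logically AND it
    with line2; only features whose parallax equals `shift` survive,
    i.e. the features lying in one depth plane."""
    shifted = [0] * shift + line1[:len(line1) - shift]
    return [a & b for a, b in zip(shifted, line2)]

I1 = [0] * 16
for p in (3, 7, 10):          # features in image I1
    I1[p] = 1
I2 = [0] * 16
for p in (5, 9, 14):          # same features in I2: two shifted by 2, one by 4
    I2[p] = 1

# A shift of 2 slices out the plane containing the first two features...
print([i for i, v in enumerate(and_with_shift(I1, I2, 2)) if v])  # [5, 9]
# ...and a shift of 4 slices out the plane containing the third.
print([i for i, v in enumerate(and_with_shift(I1, I2, 4)) if v])  # [14]
```

Each shift value thus yields one tomographic section, exactly as in the three comparison stages of Fig 3.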
In the examples considered so far the tomographic sections have been defined by planes lying perpendicular to the lines of sight of both image forming means, ie the axes of both cameras.
Sections can be formed for other planes displaced angularly with respect to the camera axes, as illustrated in Fig 4. As before, the focal planes of the image forming means, or the image planes if only one camera is used, are arranged to be coplanar, perpendicular to both lines of sight and displaced laterally. To form a tomographic section in a plane which subtends an oblique angle with respect to the lines of sight in the plane of Fig 4, one of the images is "stretched" or magnified relative to the other. In Fig 4 the right hand image I2 is enlarged, at least in the transverse direction, with respect to the left hand image I1.
Relative enlargement in only one direction contained in the plane of the drawing will yield a section in a plane which is angled obliquely with respect to the lines of sight but which is still perpendicular to the plane of the drawing.
The amount of relative enlargement or stretch required can be determined from test exposures by comparing the separation distance ratios of objects of known relative spacing. Also, sections on different planes can be formed by altering the magnification of the said magnified image I2 relative to the other image I1.
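The relative magnification applied before the logical AND can be sketched as a one-dimensional nearest-neighbour stretch. The stretch factor below is hypothetical; in practice it would be calibrated from test exposures as described above:

```python
def stretch(line, factor):
    """Nearest-neighbour magnification of a binary scan line about its
    left-hand edge; output pixel i samples input pixel i/factor."""
    n = len(line)
    return [line[int(i / factor)] if int(i / factor) < n else 0
            for i in range(n)]

# Under 2x magnification a feature at pixel 4 moves out to pixels 8-9
# (it also widens, since each input pixel now covers two output pixels).
line = [0] * 10
line[4] = 1
print([i for i, v in enumerate(stretch(line, 2.0)) if v])  # [8, 9]
```

The stretched line would then be combined with the other image by the same shift-and-AND comparison as before, yielding a section on an obliquely angled plane.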
It has been mentioned earlier that the present invention is used to best advantage by employing images which contain information about all edges of a viewed article. Images made of solid objects using visible light frequently do not contain any information relating to hidden edges or sides; thus, tomographic sections through the "far side" of a solid object or through obscured objects cannot be constructed from visible light images. Images made with x-ray radiation, however, contain such information and can be used for the purposes of the invention. Other forms of radiation which provide the same or like information may also be used in carrying out the invention.
Figure 5 shows an arrangement for forming images of a static object having features A, B, C and D which uses two x-ray sources X1 and X2. The two exposures I1 and I2 may be made simultaneously with two separate sources or they may be made sequentially in which case a single x-ray source may be used from first one position and then the other.
Referring now to the apparatus depicted in Fig 6, which is intended to overcome the problem of anomalous results arising from accidental or spurious coincidences in the images, three image forming means are arranged spaced apart along two perpendicular axes which are also perpendicular to the line of sight of each of said means. The image forming means comprise three x-ray video cameras 12, 14, 16 spaced apart along X and Y axes and having lines of sight parallel to the Z axis. The object or scene viewed by the cameras is not shown.
The cameras may view a three-dimensional object directly or may view radiographs of the object exposed using x-ray radiation.
The cameras 12, 14, 16 operate in conventional fashion by scanning raster fashion the image formed in the focal plane of the camera and producing output signals 18, 20, 22 respectively representing the instantaneous level of illumination of the image. The said output signals comprise an analogue voltage continuously variable in the range 0-0.7 volts. Each camera contains an analogue to digital converter 24, 26, 28 having a digital output range 0-255 corresponding to the analogue input range. A typical digitised video picture frame comprises 512 x 512 individual picture elements, called pixels, arranged in rows and columns. The digitised picture signal generated in each camera is stored in a digital memory or "framestore" 30, 32, 34. These digitally stored images can be displayed on a TV monitor by reading the stored values sequentially to reconstruct the scan signal, converting the digital values back to analogue signals and using the reconstituted signal as video input to a monitor. At this stage no processing of the signal has been undertaken.
Before processing to provide the final tomogram image, the video signals in each channel may be subjected to several intermediate stages of processing, indicated at 36, 38, 40 by function blocks labelled grey scale pre-processing. First, the signals may pass through one or more grey scale pre-processing stages for the purpose of matching the brightness levels and distribution in the different images. Shading correction may be carried out to suppress surface detail, such as roughness, so that areas of one brightness level are represented as the same figure on a grey scale.
Often it is found that the various levels of grey, ie contrast, in an image are not evenly distributed between black and white but concentrated into a relatively narrow band in the grey scale. A contrast stretching function may be applied to rectify this so that contrast levels are spread across the whole grey scale. The images can then be matched by equating the grey levels of corresponding areas of the different images, for example in the background, by cancelling out level differences. These functions are known, per se, in the prior art and appropriate software routines for execution by computer based image processors are well known.
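A contrast stretching function of the kind referred to can be sketched as a linear rescaling over an 8-bit grey scale. The pixel values below are hypothetical, chosen to show a narrow band of grey levels being spread across the full range:

```python
def contrast_stretch(pixels, lo=0, hi=255):
    """Spread the grey levels of an image linearly across the full
    lo-hi range (0-255 for an 8-bit grey scale)."""
    pmin, pmax = min(pixels), max(pixels)
    if pmax == pmin:                      # flat image: nothing to stretch
        return [lo] * len(pixels)
    return [lo + (p - pmin) * (hi - lo) // (pmax - pmin) for p in pixels]

# Grey levels confined to the narrow band 100-120 are spread to 0-255.
print(contrast_stretch([100, 105, 110, 120]))  # [0, 63, 127, 255]
```

Matching two images, as described above, would then amount to applying the same mapping to both so that corresponding areas end up at equal grey levels.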
Next, the video signals pass through blocks 42, 44, 46 where edges in the images are enhanced by application of a processing function which has the effect of delineating step changes of illumination, ie changes between different grey levels, in the images by switching the image pixel or pixels in the region of the change to one end or the other of the grey scale, ie switching it to black or white.
After this the edges are effectively extracted from the image by suppressing all other grey levels to leave only the previously highlighted edges. Each of the images then comprises a skeleton image of, say, black against a mid-grey background, having only binary levels of brightness.
Further intermediate processing may still be required, however, in order to reduce or obviate uncertainties and ambiguities in the final images; this is indicated by the binary processing blocks 48, 50, 52. In binary processing further operations may be performed with the intention of improving the clarity and precision of an image.
An edge shrinking or erosion function may be applied to reduce the thickness of enhanced edges down to one pixel width, for example. Suppose an edge varies in thickness along its length and is several pixels wide, then a function may be applied to determine the mean position of the edge and to set the value of the corresponding or nearest pixel only to the appropriate binary level, suppressing the others. Also at this stage breaks or discontinuities in edges may be rectified by means of a further binary delineation function which effectively redraws missing portions of edges.
Fig 7 (a) shows a video signal in the region of an edge formed by a step between two surfaces, or by the edge of a plane object viewed against a background of relatively lower illumination. The brightness of the object is proportional to the signal level on the vertical axis, the boundary between pixels is indicated, and the equivalent digital representation of the signal brightness level for each pixel is given along the lower edge of the diagram. Thus it will be seen that the signal representing brightness level changes from a relatively higher level '10' to a relatively lower level '4'. The signal transition region which represents the edge of the object occupies the width of two pixels, the digital signal levels of which are '8' and '6'.
In the edge extraction process the constant brightness level signals and the slowly changing signals are suppressed to a uniform level, but in the transition region the gradients of the signal levels are taken. In the second process step as is shown in Fig 7 (b) the partly processed "edge" signal is compared with a signal threshold level.
Signal levels greater than the threshold level are boosted to the higher of two binary signal levels and signals below the threshold are reduced to the lower of the two binary levels. At this stage the edge is three pixels wide. The edge image may be subsequently shrunk to the value of the central pixel only, the two outer pixels being set to zero.
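The Fig 7 example can be worked through numerically. The pixel levels 10, 8, 6, 4 are taken from the text; the gradient operator (a simple first difference), the threshold value, and the run-centre erosion are assumptions standing in for the functions the patent leaves unspecified:

```python
def extract_edge(levels, threshold=1):
    """Differentiate, threshold to binary, then shrink each edge run
    to its central pixel (a simple stand-in for the erosion step)."""
    grads = [abs(b - a) for a, b in zip(levels, levels[1:])]
    binary = [1 if g > threshold else 0 for g in grads]
    shrunk = [0] * len(binary)
    i = 0
    while i < len(binary):
        if binary[i]:
            j = i
            while j < len(binary) and binary[j]:
                j += 1
            shrunk[(i + j - 1) // 2] = 1   # keep the central pixel only
            i = j
        else:
            i += 1
    return binary, shrunk

# Signal of Fig 7(a): level 10 falling to 4 across a two-pixel transition.
binary, shrunk = extract_edge([10, 10, 8, 6, 4, 4])
print(binary)  # [0, 1, 1, 1, 0] -- after thresholding the edge is three pixels wide
print(shrunk)  # [0, 0, 1, 0, 0] -- shrunk to the central pixel
```

This reproduces the sequence described in the text: a three-pixel-wide thresholded edge, subsequently reduced to the value of the central pixel with the two outer pixels set to zero.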
Thus, edge signals are enhanced and the resulting image contains only binary values of black and white. The fully processed digital video signals are stored in enhanced form. The binary signal values are suitable for use directly as inputs, for example, for logical operations such as "AND" and "OR" functions.
Each of the images formed by the image forming means, ie the cameras 12, 14, 16 in Fig 6, is processed and stored in the manner just described.
The reconstruction of tomographic sections of the image, using the technique described above with reference to Fig 3, can now proceed using the stored enhanced signal values. The lateral shift of one of the images relative to the other(s) is accomplished in blocks 54, 56 by effectively offsetting the memory addresses of the row signals of one of the images. The shift value may be set at any desired value, subject to the effect this has in reducing the overlap width of the images.
Where it is required to reconstruct a plurality of tomographic sections the shift value may be increased by a suitable value before commencing each logical "AND" processing step through AND gates 58, 60, 62. The output of gate 62 comprises in serial format binary brightness values of a scanned tomographic section also in the form of a skeleton or wire frame model. A series of sections can be constructed, therefore, by incrementing the shift value in steps, each increment being made before commencing a fresh comparison stage.
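The three-camera AND cascade of Fig 6 can be sketched on small binary frames. The frame size, feature positions and shift value below are hypothetical; the point of the sketch is that a feature survives only if it coincides in BOTH stereo pairs, which eliminates the spurious coincidences discussed earlier:

```python
def shift_right(img, s):
    """Lateral shift for the X-axis stereo pair."""
    return [[0] * s + row[:len(row) - s] for row in img]

def shift_down(img, s):
    """Vertical shift for the Y-axis stereo pair."""
    w = len(img[0])
    return [[0] * w for _ in range(s)] + img[:len(img) - s]

def tomographic_slice(img_x, img_c, img_y, s):
    """AND the centre image with the shifted X- and Y-axis images
    (cf. gates 58, 60, 62): a point survives only if it coincides
    in both stereo pairs."""
    return [[a & b & c for a, b, c in zip(rx, rc, ry)]
            for rx, rc, ry in zip(shift_right(img_x, s), img_c,
                                  shift_down(img_y, s))]

Z = [[0] * 4 for _ in range(4)]
img_c = [r[:] for r in Z]; img_c[2][2] = 1; img_c[0][3] = 1
img_x = [r[:] for r in Z]; img_x[2][1] = 1; img_x[0][2] = 1  # (0,2) would match (0,3) spuriously
img_y = [r[:] for r in Z]; img_y[1][2] = 1                   # no vertical partner for (0,3)

out = tomographic_slice(img_x, img_c, img_y, 1)
print(sum(map(sum, out)), out[2][2])  # 1 1 -- only the genuine point survives
```

Incrementing the shift value `s` between calls would generate the series of sections described above.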
The image enhancement, edge extraction, and the shift and comparison steps described above are very suitable for performance by electronic logic means and in particular by computers.
Fig 8 illustrates that the stereo resolution achievable in the video imaging systems described above is a function of pixel size, or image resolution. Fig 8 (a) shows a geometrical consideration of the optical system wherein the minimum stereo distance which can be resolved is "z" and the minimum image size which can be resolved is "x". In a video imaging system the minimum value of "x" is, of course, the size of one pixel; "a" is half the separation distance between the optical axes of the imaging means, "b" represents the lens to object distance, and "c" is the lens to image plane distance. The derivation of the relationship between "x" and "z" from a consideration of the geometrical relationship of the said distances is shown in equations (i) and (ii) and the final form of the function is given at (iii). It is to be noted that although the stereo resolution is governed, inter alia, by camera magnification, ie distance "c", and by pixel size "x", it is most sensitive to the camera to object distance, being proportional to "b²".
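Equations (i) to (iii) themselves are not reproduced in the text. A plausible reconstruction from the stated geometry, using the symbols defined above (this derivation is an assumption, not taken from the patent), is:

```latex
% Disparity between the two images of a point at object distance b,
% for camera axis separation 2a and lens-to-image distance c:
D = \frac{2ac}{b} \qquad \text{(i)}
% Rate of change of disparity with object distance:
\left| \frac{dD}{db} \right| = \frac{2ac}{b^{2}} \qquad \text{(ii)}
% The smallest detectable disparity change is one pixel x, giving the
% minimum resolvable depth step:
z \approx \frac{b^{2}}{2ac}\, x \qquad \text{(iii)}
```

This form is consistent with the observation in the text that the resolution varies as b² and improves with larger camera separation a, longer image distance c, or smaller pixel size x.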

Claims (19)

1. A method of image tomography for producing a sectioned view of a three dimensional object or scene comprises the steps of: forming at least two stereographic images of the object or scene in which the image planes are co-planar, and identifying co-planar features in the object or scene as viewed by comparing the ratios of the separation distances between the said features in the plurality of images.
2. A method as claimed in claim 1 includes forming three or more stereographic images.
3. A method as claimed in claim 1 or claim 2 wherein the viewing axes of said images are mutually parallel.
4. A method as claimed in any one of claims 1 to 3 wherein the images are formed substantially simultaneously by separate imaging means.
5. A method as claimed in any one of claims 1 to 3 further comprises the step of forming the images sequentially.
6. A method as claimed in claim 5 further comprises the step of shifting the object between forming the said images.
7. A method as claimed in any preceding claim wherein the said stereographic images are formed by x-ray imaging.
8. A method as claimed in claim 7 wherein each of the images is formed by radiation from a different source.
9. A method as claimed in any preceding claim wherein the images are formed using at least one video camera.
10. A method as claimed in claim 9 including the steps of digitising an output signal from the or each video camera and processing the video images by operating on the digitised output signal.
11. A method as claimed in claim 10 including the step of applying a threshold function to the said video output signal in order to enhance the definition of edges in the image.
12. A method as claimed in claim 11 including the step of processing the video image in order to provide a binary output signal.
13. A method substantially as hereinbefore described with reference to the accompanying drawings.
14. Apparatus for image tomography comprising at least one image forming means arranged to form a plurality of images of an object or scene along a like plurality of sight lines, and means for comparing the separation distances of pairs of features in the images in order to determine which features of the images are co-planar.
15. Apparatus as claimed in claim 14 wherein the or each image forming means comprises a video camera in which an image formed in the camera is scanned raster fashion by illumination sensitive means which produces an output signal representative of the level of illumination in the image, and the means for comparing the separation distances is operative to effect a lateral shift of one of each pair of images and to compare subsequently coincident features in the signals.
16. Apparatus as claimed in claim 15 wherein the or each pair of signals for comparison are combined by a two input AND function gating means.
17. Apparatus as claimed in claim 16 comprising three image forming means spaced apart on two orthogonal axes mutually perpendicular to the direction of sight, the pair of signals for comparison being taken from the pairs of image forming means lying on said orthogonal axes, one signal of each said pairs of signals being shifted relative to the signal common to both pairs by means effective to introduce a lateral image shift, first and second AND gating means for combining the signals of each pair and a third AND gating means for combining the outputs of the first and second AND gating means.
18. A method of image tomography substantially as described with reference to the accompanying drawings.
19. Apparatus for image tomography substantially as described with reference to the accompanying drawings.
GB8710495A 1987-05-02 1987-05-02 Tomographic imaging Withdrawn GB2208194A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB8710495A GB2208194A (en) 1987-05-02 1987-05-02 Tomographic imaging

Publications (2)

Publication Number Publication Date
GB8710495D0 GB8710495D0 (en) 1987-06-03
GB2208194A true GB2208194A (en) 1989-03-08

Family

ID=10616775

Family Applications (1)

Application Number Title Priority Date Filing Date
GB8710495A Withdrawn GB2208194A (en) 1987-05-02 1987-05-02 Tomographic imaging

Country Status (1)

Country Link
GB (1) GB2208194A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2647997A1 (en) * 1989-06-02 1990-12-07 Thomson Csf Process and device for the three-dimensional reconstruction of objects from images of projections taken with any angle of incidence
AT414194B (en) * 2002-03-06 2006-10-15 Christian Dipl Ing Neugebauer DEVICE AND METHOD FOR CALIBRATING A STEREO IMAGE PROCESSING SYSTEM

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3818220A (en) * 1971-11-03 1974-06-18 A Richards Variable depth laminagraphy
GB1415314A (en) * 1971-11-26 1975-11-26 Cfc Products Film cutter for dynamic tomography
GB2016855A (en) * 1978-03-11 1979-09-26 Philips Nv Imaging of a sectional layer of a three-dimensional object
US4191890A (en) * 1976-12-03 1980-03-04 N.V. Optische Industrie "De Oude Delft" Synthetic aperture scanner for decoding a coded image produced by penetrating radiation, such as X-rays
GB1569708A (en) * 1975-08-08 1980-06-18 Philips Electronic Associated Method of recording and subsequently recovering image information by encoding and decoding a compsite of perspective images
GB2046468A (en) * 1979-03-23 1980-11-12 Philips Nv Method and apparatus for making a planigram of a three-dimensional object
GB2062403A (en) * 1979-11-08 1981-05-20 Philips Nv Device for reducing faults in layer images of a three-dimension object formed by meatns of penetrating radiation
GB2063505A (en) * 1979-10-30 1981-06-03 Philips Nv Tomosynthesis apparatus including a lens matrix
EP0034862A1 (en) * 1980-02-23 1981-09-02 Philips Patentverwaltung GmbH Device for producing tomographical images of a three-dimensional object using superposed zonograms
EP0182429A2 (en) * 1984-11-22 1986-05-28 Philips Patentverwaltung GmbH Method and arrangement for producing tomographical images of an object

Also Published As

Publication number Publication date
GB8710495D0 (en) 1987-06-03

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)