WO1997036147A1 - Methods and apparatus for using image data to determine camera location and orientation - Google Patents
- Publication number: WO1997036147A1 (application PCT/IB1996/000424)
- Authority: WIPO (PCT)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- the invention relates to the field of image processing and more particularly to methods and apparatus for determining camera position and orientation from an image captured with that camera and to accurate surveying using such methods and apparatus.
- Stereoscopic photographic cameras are known which utilize a single camera body and two objective lenses separated by a fixed distance, usually corresponding to the interocular distance.
- Other such cameras use a single objective and external arrangements which form two image areas on film positioned on the camera's image plane.
- Still other arrangements use two separate cameras separated by a fixed distance to form images corresponding to a left and right eye view of the scene being photographed.
- When stereoscopic photographic images of the prior art are developed, they are often viewed through separate eye pieces, one for each eye. Each eye piece projects a view of a respective one of the developed images which the user's eyes would have seen had the eyes viewed the scene directly. Depth is clearly discernible when viewing a stereoscopic image.
- Calculation of depth is a difficult task when using images captured from different positions vis-a-vis the scene being photographed, because the planar relationships which result from projection of a three dimensional scene onto a two dimensional plane do not undergo a linear transformation or mapping compared with the same points projected onto a different image plane.
- Different portions of a scene viewed from one point relate differently to corresponding points from the same scene viewed from another point.
- Planar surfaces which are viewed normally in one view are reduced in extent when viewed obliquely.
- Aerial surveying is also known. Images are captured from an airplane or other vehicle in transit over an area to be surveyed at positions which are precisely known by modern navigation techniques. Position of significant ground features can then be calculated using sophisticated image processing techniques which often require supercomputers. Aerial surveying techniques have the advantage that they can be accomplished without the need to place people on the ground in the area to be surveyed. Inaccessible terrain can also be surveyed in this way. However, expensive image capture equipment is required and even with very good optics and image processing, the resolution is not always as good as one might like. Also, accurate measurements in the vertical direction are even more difficult to take using aerial techniques.
- 3-D representation: e.g. a wireframe
- CAD/CAM: computer assisted design or computer assisted manufacturing
- Every recorded image whether it be a photograph, a video frame, a true perspective drawing or other form of recorded image, has associated with it a viewing location and viewing look angles that exactly describe the orientation of the recording mechanism relative to the recorded scene.
- camera location was either estimated or known a priori by locating the position from which the picture was taken using surveying techniques.
- rotation angle was assumed to be 0 (horizontal) and elevation and azimuth were either measured with varying degrees of accuracy or estimated.
- surveying and measurement increase the set up time required before capturing images for analysis, often to the point where any hope of accurate measurements would be abandoned in favor of qualitative information which could be gleaned from images captured under uncontrolled conditions.
- stereo photographs are frequently used to investigate and document accident or crime scenes.
- the accuracy of the documentation depends to a high degree on knowing exactly the viewing parameters of the cameras at the time the photographs were taken.
- Computer-generated renderings are often merged with actual photographs to convey an image of a completed construction project while still in the planning and review stages.
- it is necessary for the viewing parameters of the computer rendering to be exactly the same as the viewing parameters of the camera that took the photograph.
- the viewing parameters for any given recorded image are unknown and difficult to determine with a high degree of accuracy, even when the camera positions are physically measured relative to some established coordinate system.
- the difficulties arise from the fact that the camera lens principal points are usually located inside the lens structure and are therefore inaccessible for purposes of direct measurement.
- the measurement of viewing angles is even more difficult to accomplish without the use of surveying type tripods, levels and transits.
- Photogrammetry is a science that deals with measurements made from photographs. Generally, photogrammetrists use special camera equipment that generates fiducial marks on the photographs to assist in determining the viewing parameters. Non-photogrammetric cameras can be used in some analyses, however the associated techniques generally require knowing the locations of a large number of calibration points (five or more) that are identifiable in the recorded scene. Generally, the three-dimensional location of five or more calibration points need to be known in terms of some orthogonal, reference coordinate system, in order to determine the viewing parameters.
- the Direct Linear Transform (DLT) is a five-point calibration procedure that is sometimes employed by photogrammetrists.
- the Church resection model may be used when the optical axis of an aerial camera lens is within four or five degrees of looking vertically down on the terrain. Angular displacements from the vertical of more than a few degrees result in noticeable mathematical nonlinearities that are associated with transcendental trigonometric functions. Under these conditions, the Church resection model is no longer valid and the three-point calibration procedure no longer applies.
- the problems of the prior art are overcome in accordance with the invention by automatically identifying camera location and orientation based on image content. This can be done either by placing a calibrated target within the field of the camera or by measuring the distances among three relatively permanent points in the scene of images previously captured. Using the points, the location and orientation of a camera at the time a picture was taken can be precisely identified for each picture. Once the location and orientation of the camera are known precisely for each of two or more pictures, accurate 3-D positional information can be calculated for all other identifiable points on the images, thus permitting an accurate survey of the scene or object.
- the images can be captured by a single camera and then used to generate stereo images or stereo wireframes.
- the above and other objects and advantages of the invention are achieved by providing a method of measuring the absolute three dimensional location of points, such as point D of Figure 1, with respect to a coordinate system defined using three points, A, B and C, separated by known distances, using image data.
- the image data is captured by using one or more cameras of known focal length to capture two images of a scene containing the points A, B, C and D.
- the location and orientation of the camera(s) at the time each of said images was captured is determined with reference to said coordinate system by using information derived from said images, the known focal length and the known distances.
- the locations of the cameras at the time the images were captured are then utilized, together with other image data, to determine the location of points such as point D.
- the step of using the locations of the cameras at the time the images were captured to determine the location of said point D from image data includes defining an auxiliary coordinate system with origin along the line joining the locations of the cameras, defining the center point of each image as an origin of a set of image reference axes pointing in X', Y' and Z' directions, respectively, measuring offset in at least one of the X' and Y' directions of a point on the first image and of a corresponding point of a second image, determining the angles formed between a line joining point D, the focal point of the objective and the image of point D on one of the X' or Y' planes for each of the images, determining the distance h of point D from the line joining the camera locations using the measured offsets, the focal length and the angles, determining the X' and Y' coordinates of point D in the auxiliary coordinate system, and transforming coordinates (X', Y', h) of the auxiliary coordinate system to a representation in said coordinate system defined using said three points, A, B and C.
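In the simplest rectified case, the offset-and-focal-length step above reduces to the classic parallax relation: the disparity between corresponding image points, scaled by the baseline and focal length, gives the perpendicular distance h. A minimal sketch (function and variable names are illustrative, not from the patent):

```python
import math


def stereo_depth(baseline, focal_len, x1, x2):
    """Perpendicular distance h of a point from the line joining two
    camera principal points, for the rectified case where both image
    planes are parallel to the baseline.

    baseline  -- separation of the two principal points
    focal_len -- principal distance of the camera
    x1, x2    -- X' image coordinates of the point in the two views
    """
    disparity = x1 - x2  # offset between corresponding image points
    if disparity == 0:
        raise ValueError("point at infinity: zero disparity")
    return focal_len * baseline / disparity
```

The general (non-rectified) case in the patent first requires the camera locations and orientations, which is what the viewing-pyramid procedure below supplies.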
- the step of determining the location and orientation of said one or more cameras at the time said images were captured with reference to said coordinate system using image data, known focal length and said known distances includes representing the distances between points A, B and C and the focal point of a camera O as a viewing pyramid, modifying the representation of the pyramid to a joined three-triangle flattened representation, selecting a low estimate Ob1 for one interior side of a first triangle of said flattened representation, and solving the first triangle using image data, known focal length and said known distances, yielding, inter alia, a first calculated value for length OA given the estimate Ob1.
- the process of deriving values for camera location using distances OA, OB and OC comprises solving simultaneously equations for spheres centered at points A, B and C with respective radii of OA, OB and OC.
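Solving the three sphere equations simultaneously can be sketched as follows, in a local frame with A at the origin, B on the x axis, and C in the xy plane (this frame differs from the patent's A/B/C axis convention; function and parameter names are illustrative):

```python
import math


def trilaterate(d_ab, c_xy, r_a, r_b, r_c):
    """Camera position O from distances OA, OB, OC to known points.

    Local frame (an assumption, not the patent's convention):
    A at the origin, B at (d_ab, 0, 0), C at c_xy = (cx, cy, 0).
    Returns the intersection of the three spheres with positive z;
    the mirror solution (negative z) is the other intersection.
    """
    cx, cy = c_xy
    # Subtracting the sphere equations pairwise gives two planes,
    # which fix x and y; the first sphere then fixes |z|.
    x = (r_a**2 - r_b**2 + d_ab**2) / (2 * d_ab)
    y = (r_a**2 - r_c**2 + cx**2 + cy**2 - 2 * cx * x) / (2 * cy)
    z2 = r_a**2 - x**2 - y**2
    if z2 < 0:
        raise ValueError("spheres do not intersect")
    return x, y, math.sqrt(z2)
```

Which of the two mirror solutions is the true camera position must be decided from side information (e.g. which side of the target plane the camera was on), much as the near/mid/far ambiguity is resolved separately in the patent.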
- one calculates the azimuthal and elevational adjustment required to direct the camera to the location of point A and calculates the amount of rotation about the optical axis required to align point B once the camera points at point A. This is done iteratively until the degree of alignment is within the desired degree of accuracy.
- the invention can be used to measure the distance between two points especially in a vertical direction, to locate the physical position of objects visible in images accurately, to create a three dimensional wireframe representation and to document the "as built" condition of an object.
- the invention is also directed to a method of measuring the absolute three dimensional location O of a camera with respect to a coordinate system defined using three points, A, B and C, separated by known distances, using image data: capturing an image of a scene containing the points A, B and C using a camera, determining or knowing a priori the focal length of said camera, and determining the location of said camera at the time said image was captured with reference to said coordinate system using information derived from said image, the known focal length and said known distances.
- the invention is also directed to a method of measuring distance including vertical height by measuring the absolute three dimensional location of points D, E and F with respect to a coordinate system defined using three points, A, B and C, separated by known distances using image data using techniques described above, by determining distances between points D, E and F, and by using the location of said points D, E and F and the location of cameras at the time images were captured to determine the location of other points.
- the other points may be optionally located on images different from those used to determine the location of points D, E and F.
- the invention is also directed to apparatus for measuring the absolute three dimensional location of a point D with respect to a coordinate system defined using three points, A, B and C, separated by known distances using image data
- apparatus for measuring the absolute three dimensional location of a point D with respect to a coordinate system defined using three points, A, B and C, separated by known distances using image data, including one or more cameras for capturing images of a scene containing the points A, B, C and D, a memory interfaced to the camera(s) for storing images captured by the camera(s), and a computer for processing stored images to determine the location and orientation of the camera(s) at the time each of said images was captured with reference to said coordinate system, using information derived from said images, known focal length and said known distances, and for using the locations of said one or more cameras at the time the images were captured to determine the location of said point D from image data.
- Location information can be stored in a database which can be used for different purposes. For example, it can be used to store a three dimensional wireframe representation.
- Figure 1 is an illustration of the capture of two images of a scene, including a building, according to the invention.
- Figure 2 is an illustration of a viewing pyramid of three calibration points as projected through the focal point of a camera.
- Figure 3 is an illustration of a flattened pyramid used for calculation of camera distance.
- Figure 4 is an illustration of viewing angle determination used in calculation of camera distance.
- Figure 5 is an illustration of near, mid and far ambiguity.
- Figure 6 is an illustration of how to resolve near, mid and far ambiguity.
- Figure 7 is an illustration of azimuthal and elevational correction.
- Figure 8 is a flow chart of the algorithm used to determine camera distance and orientation.
- Figure 9 is a flow chart of the algorithm used to calculate camera location.
- Figure 10 is an illustration of how to calculate the distance of a point from a line joining the principal points of two cameras.
- Figure 11 is an illustration of the calculation of the location of a point in the X direction.
- Figure 12 is an illustration of the calculation of the location of a point in the Y direction.
- Figure 13 is an illustration of how to calculate point location generally given a determination of the location and orientation of the camera at the time when two images were captured.
- FIG. 14 is an illustration of hardware utilized in accordance with the invention.
- Figure 1 illustrates a building 100 in front of which is located a calibrated target such as a builder's square 110.
- Pictures of the building are taken from two positions, the first from point f1 and the second from point f2.
- f1 is the location of the principal point of the lens or lens system of a camera, and the image projected through that point falls on image plane fp1.
- a second image of the scene is captured from position f2, and the image through principal point f2 is cast upon image plane fp2.
- the positioning of the cameras is arbitrary. In some circumstances, it is desirable to capture images from two locations using the same camera. In other circumstances, it may be desirable to capture the images using different cameras.
- the camera is aimed so as to center the object of interest within the viewing frame.
- both cameras are pointed at center point
- a real world coordinate system is defined with the Y axis running through points A and C and an X axis defined perpendicular to the Y axis through point A in the plane of A, B and C, thus forming an origin O at point A.
- a Z axis is defined perpendicular to the XY plane and running through point A.
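Under the stated assumptions (the 3-D positions of A, B and C are known in some world frame, and Z is chosen toward cross(AC, AB), a handedness choice not specified in this extract), the axis construction above might be sketched as:

```python
import math


def sub(u, v):
    return tuple(a - b for a, b in zip(u, v))


def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])


def unit(u):
    n = math.sqrt(sum(a * a for a in u))
    return tuple(a / n for a in u)


def target_frame(a, b, c):
    """Axes of the real-world frame defined by the target points:
    origin at A, Y axis through A and C, X axis in the ABC plane
    perpendicular to Y, Z axis normal to the plane."""
    y = unit(sub(c, a))                 # Y runs from A toward C
    z = unit(cross(sub(c, a), sub(b, a)))  # normal to plane ABC
    x = cross(y, z)                     # completes the frame in-plane
    return x, y, z
```

Any measured point can then be expressed in this frame by projecting its offset from A onto the three returned axes.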
- principal point f1 is located at (X1, Y1, Z1).
- principal point f2 is located at (X2, Y2, Z2).
- a camera directed at target point T has both an azimuth and an elevation which can be specified utilizing the coordinate system.
- the camera may be rotated about the optical axis of the camera differently when the two pictures were taken. In short, there is no guarantee that the camera was horizontal to the XY plane when the picture was taken and thus, the orientation of the images may require correction prior to processing.
- Figure 2 illustrates a viewing pyramid formed by the three points A, B and C vis-a-vis the origin O (the principal point of a camera).
- the viewing pyramid can be viewed as having three surfaces, each corresponding to a surface triangle, namely, triangles AOB, BOC and COA. If one were to view the pyramid shown in Figure 2 as hollow and made of paper and if one were to cut along the line OA and flatten the resulting pattern, one would achieve a flattened pyramid such as shown in Figure 3.
- the distance OA represents the distance from point A which is at the origin of the coordinate system to point O which is at the principal point of the lens.
- angles AOB, AOC and BOC are known by virtue of knowing the distance between the principal point and the image plane and the measured distance separating two points on the image plane.
- Figure 4 assists in illustrating how this is done.
- the XY plane constitutes the image plane of the camera.
- f0 is the principal point of the lens.
- Images of points A and B are formed on the image plane after passing through the principal point at locations A and B shown on the XY plane.
- the incoming rays from points A and B are respectively shown at 400 and 410 of Figure 4.
- an image plane origin FP0 is defined, and an X axis is defined parallel to the longest dimension of the image aspect ratio.
- the Y axis is formed perpendicular thereto, and the origin FP0 lies directly under the principal point.
- Rays from points A and B form an angle alpha (α) as they pass through the focal point.
- the projection of those rays beyond the focal point also diverges at α.
- α corresponds to ∠AOB of Figure 3.
- the distances A-FP0 and B-FP0 can be determined.
- the angles separating points A, B and C can be determined in the manner just described.
- the distances separating points A, B and C are also known, either a priori by placing a calibrated target, such as a carpenter's square in the scene being photographed, or by measuring the distances between three relatively permanent points in the scene previously captured after the images have been formed.
- the distance OA represents the distance from the principal point of the camera (O) to point A, which is the origin of the coordinate system utilized to define camera position. At a high level, this is done by first assuming a very low estimate for the distance OB, such as the distance Ob1, then, with that assumption, solving triangle AOB. "Solving a triangle" means establishing (e.g. calculating) values for the length of each side and for each of the angles within the triangle. With the distance Ob1 assumed, the first triangle is solved using known, assumed or calculated values. In the process, a value for distance OA is calculated. Using the estimate Ob1, the second triangle BOC is solved and the derived distance OC is then utilized to solve the third triangle COA.
- the calculated value for OA of the third triangle is compared with the calculated value of OA of the first triangle, the estimate Ob1 is revised by adding the difference between the two values of OA to the estimate Ob1, and the process is repeated.
- the estimate Ob1 will be improved until the difference between the calculated values of OA reduces to a value less than ε.
- when ε is low enough for the accuracy needed, the iterations cease and the true value of OA is assumed to lie between the values calculated for the first and third triangles.
- Distance Ob1 is the estimate of the length of OB, which, at the outset, is set to be low.
- the distance AB is known because the dimensions of a calibrated target are known or because the distance AB has been measured after the images are captured.
- the value for ∠AOB is calculated from measurements from the image plane as illustrated in Figure 4 and discussed in connection with equations 1-7. Therefore, ∠OAB can be calculated as follows:
- the first estimate of ∠OBA can be calculated as follows:
- Ob1 is assumed to be the distance OB.
- Distance BC is known from the target or measurements and angle BOC is known from measurements from the image plane.
- the third triangle can be solved in a manner completely analogous to the solution of the second triangle, substituting the corresponding lengths and angles of the third triangle into equations 8-12.
- the distance OA which has been calculated as set forth above.
- This distance OA from the third triangle will have been derived based on calculations from the first, second and third triangles. Note, however, that the distance OA from the third triangle and the distance OA from the first triangle would be identical if the assumed value Ob1 were in fact equal to the real length OB. Since Ob1 was initially assumed to be of very low value, there will generally be a difference between the value of OA from the third triangle as compared with that from the first triangle. The difference between the two calculated lengths is added to the original estimate Ob1 to form an estimate Ob2 for the second iteration.
- the estimate for the distance OB can be made accurate to whatever degree of resolution one desires by continuing the iterative process until the difference between OA from the first triangle and that from the third triangle is reduced to an acceptable level, ε.
- the distance OA which results from the iterative process is then equal to the distance from the principal point of the camera, shown at O in Figure 3, to point A, which is the origin of the coordinate system defined for this set of measurements.
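The iterative solution described above can be sketched as follows. One deliberate substitution: where the patent adds the OA difference to the estimate on each pass, this sketch bisects on a bracket around one candidate root of the same closure discrepancy, since the near/mid/far ambiguity means other brackets may hold other solutions. All function names are illustrative, and the law-of-sines step takes the acute branch of the arcsine:

```python
import math


def third_side(opp, angle_o, adj):
    """Solve one triangle of the flattened pyramid: given the apex
    angle at O, the side opposite it (a known target distance) and
    one adjacent side (an estimate), return the other adjacent side
    by the law of sines (acute branch)."""
    s = adj * math.sin(angle_o) / opp
    if s > 1.0:
        raise ValueError("estimate too large for a valid triangle")
    other = math.pi - angle_o - math.asin(s)  # remaining angle
    return opp * math.sin(other) / math.sin(angle_o)


def closure_gap(ob, ab, bc, ca, aob, boc, coa):
    """OA from the third triangle minus OA from the first triangle;
    zero when the estimate of OB is correct."""
    oa1 = third_side(ab, aob, ob)   # triangle AOB -> OA
    oc = third_side(bc, boc, ob)    # triangle BOC -> OC
    oa3 = third_side(ca, coa, oc)   # triangle COA -> OA again
    return oa3 - oa1


def solve_ob(lo, hi, ab, bc, ca, aob, boc, coa, eps=1e-10):
    """Drive the closure gap to zero by bisection on [lo, hi], which
    must bracket one candidate solution for OB."""
    f_lo = closure_gap(lo, ab, bc, ca, aob, boc, coa)
    while hi - lo > eps:
        mid = 0.5 * (lo + hi)
        if f_lo * closure_gap(mid, ab, bc, ca, aob, boc, coa) <= 0:
            hi = mid
        else:
            lo = mid
            f_lo = closure_gap(lo, ab, bc, ca, aob, boc, coa)
    return 0.5 * (lo + hi)
```

With OB recovered, OA and OC follow from `third_side`, and the camera position follows from the sphere intersection; the remaining ambiguity among the six near/mid/far permutations is resolved as the patent describes, by trying the alternative flattenings.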
- As shown in Figure 5, when viewing the points A, B and C from the principal point of the camera, one cannot necessarily determine which of points A, B and C is closest and next closest to the camera. For example, in Figure 5, given that point B1 is closest to the camera, it is possible that either point A is closer and point C farther, or alternatively, that point C is closer and point A farther. These differences are reflected in triangle A1B1C1 as compared with triangle A2B1C2.
- the table shown in Figure 5 illustrates that the relationship between points A, B and C may in general result in six different permutations. There will always be these combinations of near, mid and far when working toward a solution. Right at the start, one doesn't know which point is closest to the camera and which is furthest and which is midpoint.
- the difference between OA of the first and third triangles is added to the estimate Ob1 to determine the estimate to be utilized in the next iteration. It is, of course, possible to utilize a factor other than 1 to 1 and to adjust the estimate by a fraction or a multiple of the difference between the values of OA for the first and third triangles. The preferred adjustment, however, is 1 to 1.
- It is preferred that a right angle calibration target be used, such as an 8 1/2 x 11 sheet of paper or a carpenter's square.
- the six potential arrangements of near, mid and far for points A, B, C can be viewed as different ways of flattening the pyramid.
- Three sets of flattened pyramids can be formed by using each of the edges OA, OB and OC as the edge which is "opened" (e.g. if the pyramid were formed by folding paper into a pyramid shape, and one edge were cut open and the pyramid unfolded into a pattern like that shown in Figure 3, three different sets of flattened pyramids would be formed, each by cutting a different edge).
- Each set has two members corresponding to the two orders in which the triangles may occur. As illustrated in Figure 3, for example, the triangles are solved in 1-2-3 order. This ordering represents one of the 2 members.
- the other member is formed by flipping the flattened pyramid over on its face so that triangle 3, as shown in Figure 3 is put in the triangle 1 position. This member of the set is solved in 3-2-1 order as labeled.
- the techniques described herein are applicable to images photographed without a calibrated target. By selecting 3 convenient points on the image and physically measuring the distance between them after the image has been captured, the same effect can be achieved as is achieved using a calibrated target at the time the image is captured.
- Figure 7 illustrates how azimuthal and elevational corrections are determined.
- Figure 7 illustrates the image plane.
- Points ABC are the same points ABC utilized to define a coordinate system and to calculate the distance of the camera in that coordinate system.
- Points A, B and C are illustrated as part of the image shown in the image plane.
- a center of the plane, i.e. the center of the picture, is defined.
- a calibrated target or the three points utilized to establish a coordinate system, A, B and C are typically not at the center of the photograph.
- the azimuthal correction is essentially that required to displace point A, the image of the origin of the external world coordinate system so that it lies exactly on top of the photographic location of point A shown to the right of axis 710 of the coordinate system of the image plane.
- the elevational correction is the angle of elevation or declination required to place the image of point A exactly on top of the photographic location of point A shown below the abscissa of the image plane coordinate system 700.
- azimuthal and elevational corrections are determined such that if they were applied to the camera, point A, the origin of the real world coordinate system would coincide with point A, the origin as captured on the photograph.
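For small angles, the corrections above follow directly from the image-plane offset between where point A is predicted to appear and where it was actually photographed, together with the principal distance. A sketch (hypothetical names; the full procedure iterates this together with the tilt correction, per Figure 8):

```python
import math


def az_el_correction(observed_xy, predicted_xy, focal_len):
    """Azimuth and elevation adjustments that would move the predicted
    image of point A onto its photographed location.

    observed_xy  -- (x, y) where A actually appears on the image plane
    predicted_xy -- (x, y) where the current camera estimate puts A
    focal_len    -- principal distance of the camera
    Returns (delta_azimuth, delta_elevation) in radians.
    """
    dx = observed_xy[0] - predicted_xy[0]
    dy = observed_xy[1] - predicted_xy[1]
    # Each offset subtends an angle at the principal point.
    return math.atan2(dx, focal_len), math.atan2(dy, focal_len)
```

Once A coincides, the residual displacement of B off its predicted axis gives the roll (tilt) correction about the optical axis, and the pair of corrections is repeated until both residuals fall below the desired ε.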
- Figure 7 assumes that if A is correctly located, points B and C will also be correctly located. However, this is generally not true because of tilt of the camera about the optical axis. Once the points A have been superimposed, one knows where point B should be because of the axis definitions in the real world coordinate system. If the origin of the real world coordinate system is centered on A, and the origin of the image plane coordinate system is now also centered on A by virtue of the azimuthal and elevational corrections applied in connection with Figure 7, then point B on the image plane should be located where point B in the real world coordinate system is located. This would be the case if the camera were absolutely horizontal when the picture was taken. However, if there is tilt, B will be displaced off the axis.
- the B point residual error and the C point residual error are utilized as discriminators.
- Figure 8 illustrates the process utilized to fully determine the location and orientation of a camera from the image.
- In step 800, one determines the location of the calibration points A, B and C and either knows or measures the distances between them (810).
- the camera location in XYZ coordinates is determined using the technique set forth in Figure 9. Once the XYZ camera location is determined, corrections are made to azimuth and elevation (830) and then to tilt (840). With azimuth and tilt corrections made, one determines whether the points are correctly located within a desired accuracy ε (850). If they are, the location and orientation of the camera is fully determined (860) and the process ends. If they are not, another iteration of steps 830 and 840 is undertaken to bring the location determination within the desired accuracy.
- FIG. 9 illustrates the details of block 820 of Figure 8.
- Knowing the principal distance of the camera, one measures the three angles AOB, BOC and COA from the image plane (900).
- a viewing pyramid is constructed with distance OA assumed as the longest dimension (905).
- the pyramid is flattened and a value estimated for line segment OB which is known to be low (910).
- the first triangle is solved (915).
- Second and third triangles are then sequentially solved using the results of the prior calculations (920 and 925).
- If ΔOA ≥ ε (940), the value ΔOA is added to the prior estimate of OB to form a new estimate, and a new iteration of steps 915, 920, 925, 930 and 940 occurs. If ΔOA < ε (940), then the viewing pyramid is solved (950) and it is only necessary to resolve the near, mid and far ambiguity (960) before the objective of totally determining the position and orientation of the camera (970) is achieved.
- The coordinates X0 and Y0 of the point O can be defined with respect to a camera axis as shown in Figures 11 and 12.
- Figure 13 illustrates a typical real world situation.
- The points A, B and C represent the calibrated target or the points measured subsequent to image capture.
- The coordinate system X, Y and Z is established in accordance with the conventions set forth above, with A as the origin.
- Camera positions 1 and 2, illustrated only by their principal points O1 and O2 and their image planes IP1 and IP2, respectively, are positioned with their optical axes pointed at point T, which would be the center of the field on the image plane.
- FIG. 14 illustrates hardware utilized to carry out certain aspects of the invention.
- Camera 1400 is used to capture images to be analyzed in accordance with the invention.
- Camera 1400 may be a digital still camera or a video camera with a frame grabber. Images from the camera are loaded onto computer 1420 using camera interface 1410. Normally, images loaded through interface 1410 would be stored on hard drive 1423 and then later retrieved for processing in video RAM 1430. However, images can be loaded directly into video RAM if desired.
- Video RAM 1430 preferably contains sufficient image storage to permit the simultaneous processing of two images from the camera.
- Video display 1440 is preferably a high resolution display such as a cathode ray tube or a corresponding display implemented in semiconductor technology. Display 1440 is interfaced to the computer bus through display interface 1424 and may be utilized to display individual images, both images simultaneously, or three dimensional wire frames created in accordance with the invention.
- Keyboard 1450 is interfaced to the bus over keyboard interface 1422 in the usual manner.
- Distances may be conveniently measured in numbers of pixels in the vertical and horizontal directions, which may be translated into linear measurements on the display screen knowing the resolution of the display in the vertical and horizontal directions. Numbers of pixels may be readily determined by pointing and clicking on the points under consideration and obtaining the addresses of the pixels clicked upon from the cursor addresses.
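The pixel-counting measurement can be sketched as follows: given the pixel addresses of two clicked points and the physical size and resolution of the display, the on-screen distance follows directly. The function name and the illustrative resolution values are assumptions.

```python
def pixel_distance_mm(p1, p2, screen_width_mm, screen_height_mm,
                      res_x=1024, res_y=768):
    """Convert the pixel addresses of two clicked points into a linear
    distance on the display screen, in millimetres.  res_x and res_y are
    the display resolution in pixels (illustrative defaults)."""
    dx_mm = (p2[0] - p1[0]) * screen_width_mm / res_x
    dy_mm = (p2[1] - p1[1]) * screen_height_mm / res_y
    return (dx_mm ** 2 + dy_mm ** 2) ** 0.5
```

For a hypothetical display with one millimetre per pixel in each direction, clicked points 3 pixels apart horizontally and 4 vertically are 5 mm apart on screen.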
- The techniques set forth herein permit accurate forensic surveying of accident or crime scenes, as well as accurate surveying of buildings or construction sites, particularly in the vertical direction, which had heretofore been practically impossible.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Length Measuring Devices By Optical Means (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP8535977A JPH11514434A (en) | 1996-03-28 | 1996-03-28 | Method and apparatus for determining camera position and orientation using image data |
AU54067/96A AU5406796A (en) | 1996-03-28 | 1996-03-28 | Methods and apparatus for using image data to determine camera location and orientation |
PCT/IB1996/000424 WO1997036147A1 (en) | 1996-03-28 | 1996-03-28 | Methods and apparatus for using image data to determine camera location and orientation |
EP96911069A EP0892911A1 (en) | 1996-03-28 | 1996-03-28 | Methods and apparatus for using image data to determine camera location and orientation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB1996/000424 WO1997036147A1 (en) | 1996-03-28 | 1996-03-28 | Methods and apparatus for using image data to determine camera location and orientation |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1997036147A1 true WO1997036147A1 (en) | 1997-10-02 |
Family
ID=11004428
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB1996/000424 WO1997036147A1 (en) | 1996-03-28 | 1996-03-28 | Methods and apparatus for using image data to determine camera location and orientation |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP0892911A1 (en) |
JP (1) | JPH11514434A (en) |
AU (1) | AU5406796A (en) |
WO (1) | WO1997036147A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8855442B2 (en) * | 2012-04-30 | 2014-10-07 | Yuri Owechko | Image registration of multimodal data using 3D-GeoArcs |
EP3178884B1 (en) | 2015-12-08 | 2018-02-07 | Evonik Degussa GmbH | Aqueous [3- (2,3-dihydroxyprop-1-oxy) propyl] silanololigomer containing composition, method for their preparation and their use |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4965840A (en) * | 1987-11-27 | 1990-10-23 | State University Of New York | Method and apparatus for determining the distances between surface-patches of a three-dimensional spatial scene and a camera system |
US5259037A (en) * | 1991-02-07 | 1993-11-02 | Hughes Training, Inc. | Automated video imagery database generation using photogrammetry |
US5361217A (en) * | 1992-05-07 | 1994-11-01 | Fuji Photo Optical Co., Ltd. | Position measuring/plotting apparatus |
US5365597A (en) * | 1993-06-11 | 1994-11-15 | United Parcel Service Of America, Inc. | Method and apparatus for passive autoranging using relaxation |
1996
- 1996-03-28 AU AU54067/96A patent/AU5406796A/en not_active Abandoned
- 1996-03-28 JP JP8535977A patent/JPH11514434A/en active Pending
- 1996-03-28 EP EP96911069A patent/EP0892911A1/en not_active Withdrawn
- 1996-03-28 WO PCT/IB1996/000424 patent/WO1997036147A1/en not_active Application Discontinuation
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1999067746A1 (en) * | 1998-06-24 | 1999-12-29 | Sports Training Technologies, S.L. | Method for capturing, analyzing and representing the movement of bodies and objects |
DE19922321C2 (en) * | 1999-05-14 | 2002-07-18 | Zsp Geodaetische Sys Gmbh | Method and arrangement for performing geodetic measurements using a video tachymeter |
DE19922341C2 (en) * | 1999-05-14 | 2002-08-29 | Zsp Geodaetische Sys Gmbh | Method and arrangement for determining the spatial coordinates of at least one object point |
EP1394502A2 (en) * | 2002-08-29 | 2004-03-03 | Olympus Optical Co., Ltd. | Calibration pattern unit |
EP1394502A3 (en) * | 2002-08-29 | 2006-10-04 | Olympus Optical Co., Ltd. | Calibration pattern unit |
EP2166510A1 (en) * | 2008-09-18 | 2010-03-24 | Delphi Technologies, Inc. | Method for calculating the position and orientation of a camera in a vehicle |
CN101676686B (en) * | 2008-09-18 | 2013-01-02 | 德尔菲技术公司 | Method for calculating the position and orientation of a camera in a vehicle |
US9160979B1 (en) * | 2011-05-27 | 2015-10-13 | Trimble Navigation Limited | Determining camera position for a photograph having a displaced center of projection |
US20130141461A1 (en) * | 2011-12-06 | 2013-06-06 | Tom Salter | Augmented reality camera registration |
US9524436B2 (en) * | 2011-12-06 | 2016-12-20 | Microsoft Technology Licensing, Llc | Augmented reality camera registration |
US10539412B2 (en) | 2014-07-31 | 2020-01-21 | Hewlett-Packard Development Company, L.P. | Measuring and correcting optical misalignment |
CN112470188A (en) * | 2018-05-25 | 2021-03-09 | 艾奎菲股份有限公司 | System and method for multi-camera placement |
US12010431B2 (en) | 2018-05-25 | 2024-06-11 | Packsize Llc | Systems and methods for multi-camera placement |
US20230305553A1 (en) * | 2019-07-31 | 2023-09-28 | Textron Innovations Inc. | Navigation system with camera assist |
US11914362B2 (en) * | 2019-07-31 | 2024-02-27 | Textron Innovations, Inc. | Navigation system with camera assist |
CN114842164A (en) * | 2022-06-17 | 2022-08-02 | 中国人民解放军陆军炮兵防空兵学院 | Method and system for calculating coordinates of frying points based on three-dimensional geographic model |
CN114842164B (en) * | 2022-06-17 | 2023-04-07 | 中国人民解放军陆军炮兵防空兵学院 | Method and system for calculating coordinates of frying points based on three-dimensional geographic model |
Also Published As
Publication number | Publication date |
---|---|
JPH11514434A (en) | 1999-12-07 |
AU5406796A (en) | 1997-10-17 |
EP0892911A1 (en) | 1999-01-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5699444A (en) | Methods and apparatus for using image data to determine camera location and orientation | |
US11200734B2 (en) | Method for reconstructing three-dimensional space scene based on photographing | |
US6246412B1 (en) | Interactive construction and refinement of 3D models from multiple panoramic images | |
CN110296691B (en) | IMU calibration-fused binocular stereo vision measurement method and system | |
JP4685313B2 (en) | Method for processing passive volumetric image of any aspect | |
Pollefeys et al. | Self-calibration and metric reconstruction inspite of varying and unknown intrinsic camera parameters | |
US8107722B2 (en) | System and method for automatic stereo measurement of a point of interest in a scene | |
US6271855B1 (en) | Interactive construction of 3D models from panoramic images employing hard and soft constraint characterization and decomposing techniques | |
US6084592A (en) | Interactive construction of 3D models from panoramic images | |
Wang et al. | Accurate georegistration of point clouds using geographic data | |
EP0892911A1 (en) | Methods and apparatus for using image data to determine camera location and orientation | |
US8509522B2 (en) | Camera translation using rotation from device | |
Rawlinson | Design and implementation of a spatially enabled panoramic virtual reality prototype | |
Dhome | Visual Perception Through Video Imagery | |
Negahdaripour et al. | Integrated system for robust 6-dof positioning utilizing new closed-form visual motion estimation methods in planar terrains | |
Ahmadabadian | Photogrammetric multi-view stereo and imaging network design | |
Perfant et al. | Scene registration in aerial image analysis | |
Huang et al. | Calibration of line-based panoramic cameras | |
Stylianidis et al. | Measurements: Introduction to Photogrammetry | |
Erdnüß | A review of the one-parameter division undistortion model | |
Scheibe | Design and test of algorithms for the evaluation of modern sensors in close-range photogrammetry | |
Chen et al. | A Vision-aided Localization and Geo-registration Method for Urban ARGIS Based on 2D Maps. | |
CN115375748A (en) | Deformation quantity determining method and device and electronic equipment | |
Srestasathiern | Line Based Estimation of Object Space Geometry and Camera Motion | |
Wang | Using Linear Features for Aerial Image Sequence Mosaiking |
Legal Events
Date | Code | Title | Description
---|---|---|---
| WWE | Wipo information: entry into national phase | Ref document number: 1996911069; Country of ref document: EP |
| AK | Designated states | Kind code of ref document: A1; Designated state(s): AL AM AT AU AZ BB BG BR BY CA CH CN CZ DE DK EE ES FI GB GE HU IS JP KE KG KP KR KZ LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG UZ VN AM AZ BY KG KZ MD RU TJ TM |
| AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): KE LS MW SD SZ UG AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML |
| DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| WWP | Wipo information: published in national office | Ref document number: 1996911069; Country of ref document: EP |
| REG | Reference to national code | Ref country code: DE; Ref legal event code: 8642 |
| WWW | Wipo information: withdrawn in national office | Ref document number: 1996911069; Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: CA |