EP0892911A1 - Methods and apparatus for using image data to determine camera location and orientation - Google Patents

Methods and apparatus for using image data to determine camera location and orientation

Info

Publication number
EP0892911A1
Authority
EP
European Patent Office
Prior art keywords
points
point
images
location
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP96911069A
Other languages
German (de)
English (en)
Inventor
Charles S. Palm
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Synthonics Inc
Original Assignee
Synthonics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Synthonics Inc
Publication of EP0892911A1 (fr)
Legal status: Withdrawn

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 - Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 - Details
    • G01C 3/06 - Use of electric means to obtain final indication
    • G01C 3/08 - Use of electric radiation detectors
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30244 - Camera pose

Definitions

  • the invention relates to the field of image processing and more particularly to methods and apparatus for determining camera position and orientation from an image captured with that camera and to accurate surveying using such methods and apparatus.
  • Stereoscopic photographic cameras are known which utilize a single camera body and two objective lenses separated by a fixed distance, usually corresponding to the interocular distance.
  • Other such cameras use a single objective and external arrangements which form two image areas on film positioned on the camera's image plane.
  • Still other arrangements use two separate cameras separated by a fixed distance to form images corresponding to a left and right eye view of the scene being photographed.
  • When stereoscopic photographic images of the prior art are developed, they are often viewed through separate eyepieces, one for each eye. Each eyepiece projects a view of a respective one of the developed images which the user's eyes would have seen had the eyes viewed the scene directly. Depth is clearly discernible when viewing a stereoscopic image.
  • Calculation of depth is a difficult task when using images captured from different positions vis-a-vis the scene being photographed, because the planar relationships which result from projection of a three dimensional scene onto a two dimensional plane do not undergo a linear transformation or mapping compared with the same points projected onto a different image plane.
  • Different portions of a scene viewed from one point relate differently to corresponding points from the same scene viewed from another point.
  • Planar surfaces which are viewed normally in one view are reduced in extent when viewed obliquely.
  • Aerial surveying is also known. Images are captured from an airplane or other vehicle in transit over an area to be surveyed at positions which are precisely known by modern navigation techniques. Position of significant ground features can then be calculated using sophisticated image processing techniques which often require supercomputers. Aerial surveying techniques have the advantage that they can be accomplished without the need to place people on the ground in the area to be surveyed. Inaccessible terrain can also be surveyed in this way. However, expensive image capture equipment is required and even with very good optics and image processing, the resolution is not always as good as one might like. Also, accurate measurements in the vertical direction are even more difficult to take using aerial techniques.
  • 3-D representation: e.g. a wireframe
  • CAD/CAM: computer-assisted design or computer-assisted manufacturing
  • Every recorded image whether it be a photograph, a video frame, a true perspective drawing or other form of recorded image, has associated with it a viewing location and viewing look angles that exactly describe the orientation of the recording mechanism relative to the recorded scene.
  • camera location was either estimated or known a priori by locating the position from which the picture was taken using surveying techniques.
  • rotation angle was assumed to be 0 (horizontal) and elevation and azimuth were either measured with varying degrees of accuracy or estimated.
  • surveying and measurement increase the set up time required before capturing images for analysis, often to the point where any hope of accurate measurements would be abandoned in favor of qualitative information which could be gleaned from images captured under uncontrolled conditions.
  • stereo photographs are frequently used to investigate and document accident or crime scenes.
  • the accuracy of the documentation depends to a high degree on knowing exactly the viewing parameters of the cameras at the time the photographs were taken.
  • Computer-generated renderings are often merged with actual photographs to convey an image of a completed construction project while still in the planning and review stages.
  • it is necessary for the viewing parameters of the computer rendering to be exactly the same as the viewing parameters of the camera that took the photograph.
  • the viewing parameters for any given recorded image are unknown and difficult to determine with a high degree of accuracy, even when the camera positions are physically measured relative to some established coordinate system.
  • the difficulties arise from the fact that the camera lens principal points are usually located inside the lens structure and are therefore inaccessible for purposes of direct measurement.
  • the measurement of viewing angles is even more difficult to accomplish without the use of surveying type tripods, levels and transits.
  • Photogrammetry is a science that deals with measurements made from photographs. Generally, photogrammetrists use special camera equipment that generates fiducial marks on the photographs to assist in determining the viewing parameters. Non-photogrammetric cameras can be used in some analyses; however, the associated techniques generally require knowing the locations of a large number of calibration points (five or more) that are identifiable in the recorded scene. Generally, the three-dimensional locations of five or more calibration points need to be known in terms of some orthogonal reference coordinate system in order to determine the viewing parameters.
  • the Direct Linear Transform (DLT) is a five-point calibration procedure that is sometimes employed by photogrammetrists.
  • the Church resection model may be used when the optical axis of an aerial camera lens is within four or five degrees of looking vertically down on the terrain. Angular displacements from the vertical of more than a few degrees result in noticeable mathematical nonlinearities that are associated with transcendental trigonometric functions. Under these conditions, the Church resection model is no longer valid and the three-point calibration procedure no longer applies.
  • the problems of the prior art are overcome in accordance with the invention by automatically identifying camera location and orientation based on image content. This can be done either by placing a calibrated target within the field of the camera or by measuring the distances among three relatively permanent points in the scene of images previously captured. Using the points, the location and orientation of the camera at the time a picture was taken can be precisely identified for each picture. Once the location and orientation of the camera are known precisely for each of two or more pictures, accurate 3-D positional information can be calculated for all other identifiable points on the images, thus permitting an accurate survey of the scene or object.
  • the images can be captured by a single camera and then used to generate stereo images or stereo wireframes.
  • the above and other objects and advantages of the invention are achieved by providing a method of measuring the absolute three dimensional location of points, such as point D of Figure 1, with respect to a coordinate system defined using three points, A, B and C, separated by known distances, using image data.
  • the image data is captured by using one or more cameras of known focal length to capture two images of a scene containing the points A, B, C and D.
  • the location and orientation of the camera(s) at the time each of said images was captured is determined with reference to said coordinate system by using information derived from said images, the known focal length and the known distances.
  • the locations of the cameras at the time the images were captured are then utilized with other image data to determine the location of points such as point D.
  • the step of using the locations of the cameras at the time the images were captured to determine the location of said point D from image data includes defining an auxiliary coordinate system with its origin along the line joining the locations of the cameras, defining the center point of each image as the origin of a set of image reference axes pointing in the X', Y' and Z' directions, respectively, measuring the offset in at least one of the X' and Y' directions of a point on the first image and of the corresponding point of the second image, determining the angles formed between a line joining point D, the focal point of the objective and the image of point D on one of the X' or Y' planes for each of the images, determining the distance h of point D from the line joining the cameras using the measured offsets, the focal length and the angles, determining the X' and Y' coordinates of point D in the auxiliary coordinate system, and transforming the coordinates (X', Y', h) of the auxiliary coordinate system to a representation in said coordinate system defined using said three points A, B and C.
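A minimal numerical sketch of the parallax step just described, under the simplifying assumption that the two optical axes are parallel and perpendicular to the baseline (the patent's construction handles converging axes through the measured angles); the function and variable names are illustrative, not the patent's notation:

```python
def point_in_auxiliary_frame(x1, y1, x2, focal, baseline):
    """Locate a point such as D in the auxiliary frame of the two cameras.

    x1, y1   -- offsets of the point's image from the centre of image 1
    x2       -- X' offset of the corresponding point in image 2
    focal    -- focal length, in the same units as the offsets
    baseline -- separation of the two principal points along the X' axis

    Assumes parallel optical axes along Z' (a simplified case).
    """
    disparity = x1 - x2
    if abs(disparity) < 1e-12:
        raise ValueError("zero disparity: the point lies at infinity")
    h = focal * baseline / disparity   # distance h from the camera baseline
    x_aux = x1 * h / focal             # X' coordinate in the auxiliary frame
    y_aux = y1 * h / focal             # Y' coordinate in the auxiliary frame
    return x_aux, y_aux, h
```

The remaining step, transforming (X', Y', h) into the A, B, C frame, is a rigid rotation and translation fixed by the two recovered camera poses.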
  • the step of determining the location and orientation of said one or more cameras at the time said images were captured with reference to said coordinate system using image data, known focal length and said known distances includes representing the distances between points A, B and C and the focal point of a camera O as a viewing pyramid, modifying the representation of the pyramid to a joined three-triangle flattened representation, selecting a low estimate Ob1 for one interior side of a first triangle of said flattened representation, and solving the first triangle using image data, known focal length and said known distances, yielding, inter alia, a first calculated value for length OA given the estimate Ob1; the second and third triangles are then solved in turn, and the estimate Ob1 is revised iteratively until the two calculated values of OA agree, as described below.
  • the process of deriving values for camera location using distances OA, OB and OC comprises solving simultaneously equations for spheres centered at points A, B and C with respective radii of OA, OB and OC.
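Those simultaneous sphere equations have a standard closed-form solution. A sketch follows (numpy assumed; the names are mine, not the patent's); it returns the two mirror-image candidates for O, one on each side of the plane of A, B and C, between which the near/mid/far reasoning of Figures 5 and 6 discriminates:

```python
import numpy as np

def camera_from_ranges(A, B, C, rA, rB, rC):
    """Intersect three spheres centred at the calibration points A, B, C
    with radii OA, OB, OC; classic trilateration closed form."""
    A, B, C = (np.asarray(p, dtype=float) for p in (A, B, C))
    ex = (B - A) / np.linalg.norm(B - A)      # unit vector from A toward B
    i = np.dot(ex, C - A)
    ey = C - A - i * ex
    ey /= np.linalg.norm(ey)                  # second in-plane unit vector
    ez = np.cross(ex, ey)                     # unit normal to the ABC plane
    d = np.linalg.norm(B - A)
    j = np.dot(ey, C - A)
    x = (rA**2 - rB**2 + d**2) / (2.0 * d)
    y = (rA**2 - rC**2 + i**2 + j**2) / (2.0 * j) - (i / j) * x
    z_sq = rA**2 - x**2 - y**2
    if z_sq < 0.0:
        raise ValueError("inconsistent radii: the spheres do not intersect")
    z = np.sqrt(z_sq)
    foot = A + x * ex + y * ey
    return foot + z * ez, foot - z * ez       # the two mirror solutions
```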
  • one calculates the azimuthal and elevational adjustment required to direct the camera to the location of point A and calculates the amount of rotation about the optical axis required to align point B once the camera points at point A. This is done iteratively until the degree of alignment is within the desired degree of accuracy.
  • the invention can be used to measure the distance between two points especially in a vertical direction, to locate the physical position of objects visible in images accurately, to create a three dimensional wireframe representation and to document the "as built" condition of an object.
  • the invention is also directed to a method of measuring the absolute three dimensional location O of a camera with respect to a coordinate system defined using three points, A, B and C, separated by known distances, using image data: capturing an image of a scene containing the points A, B and C using a camera, determining or knowing a priori the focal length of said camera, and determining the location of said camera at the time said image was captured with reference to said coordinate system using information derived from said image, the known focal length and said known distances.
  • the invention is also directed to a method of measuring distance, including vertical height, by measuring the absolute three dimensional locations of points D, E and F with respect to a coordinate system defined using three points, A, B and C, separated by known distances, using image data and the techniques described above; by determining the distances between points D, E and F; and by using the locations of said points D, E and F and the locations of the cameras at the time images were captured to determine the location of other points.
  • the other points may be optionally located on images different from those used to determine the location of points D, E and F.
  • the invention is also directed to apparatus for measuring the absolute three dimensional location of a point D with respect to a coordinate system defined using three points, A, B and C, separated by known distances using image data, including one or more cameras for capturing images of a scene containing the points A, B, C and D, a memory interfaced to the camera(s) for storing images captured by the camera(s), and a computer for processing stored images to determine the location and orientation of the camera(s) at the time each of said images was captured with reference to said coordinate system, using information derived from said images, the known focal length and said known distances, and for using the locations of said one or more cameras at the time the images were captured to determine the location of said point D from image data.
  • Location information can be stored in a database which can be used for different purposes. For example, it can be used to store a three dimensional wireframe representation.
  • Figure 1 is an illustration of the capture of two images of a scene, including a building, according to the invention.
  • Figure 2 is an illustration of a viewing pyramid of three calibration points as projected through the focal point of a camera.
  • Figure 3 is an illustration of a flattened pyramid used for calculation of camera distance.
  • Figure 4 is an illustration of viewing angle determination used in calculation of camera distance.
  • Figure 5 is an illustration of near, mid and far ambiguity.
  • Figure 6 is an illustration of how to resolve near, mid and far ambiguity.
  • Figure 7 is an illustration of azimuthal and elevational correction.
  • Figure 8 is a flow chart of the algorithm used to determine camera distance and orientation.
  • Figure 9 is a flow chart of the algorithm used to calculate camera location.
  • Figure 10 is an illustration of how to calculate the distance of a point from a line joining the principal points of two cameras.
  • Figure 11 is an illustration of the calculation of the location of a point in the X direction.
  • Figure 12 is an illustration of the calculation of the location of a point in the Y direction.
  • Figure 13 is an illustration of how to calculate point location generally given a determination of the location and orientation of the camera at the time when two images were captured.
  • Figure 14 is an illustration of hardware utilized in accordance with the invention.
  • Figure 1 illustrates a building 100 in front of which is located a calibrated target such as a builder's square 110.
  • Pictures of the building are taken from two positions, the first from point f1 and the second from point f2.
  • f1 is the location of the principal point of the lens or lens system of a camera, and the image projected through that point falls on image plane fp1.
  • a second image of the scene is captured from position f2, and the image through principal point f2 is cast upon image plane fp2.
  • the positioning of the cameras is arbitrary. In some circumstances, it is desirable to capture images from two locations using the same camera. In other circumstances, it may be desirable to capture the images using different cameras.
  • the camera is aimed so as to center the object of interest within the viewing frame.
  • both cameras are pointed at the center point T.
  • a real world coordinate system is defined with the Y axis running through points A and C and an X axis defined perpendicular to the Y axis through point A in the plane of A, B and C, thus forming an origin O at point A.
  • a Z axis is defined perpendicular to the XY plane and running through point A.
  • principal point f1 is located at (X1, Y1, Z1).
  • principal point f2 is located at (X2, Y2, Z2).
  • a camera directed at target point T has both an azimuth and an elevation which can be specified utilizing the coordinate system.
  • the camera may be rotated about the optical axis of the camera differently when the two pictures were taken. In short, there is no guarantee that the camera was horizontal to the XY plane when the picture was taken and thus, the orientation of the images may require correction prior to processing.
  • Figure 2 illustrates a viewing pyramid formed by the three points A, B and C vis-a-vis the origin O (the principal point of a camera).
  • the viewing pyramid can be viewed as having three surfaces, each corresponding to a surface triangle, namely, triangles AOB, BOC and COA. If one were to view the pyramid shown in Figure 2 as hollow and made of paper and if one were to cut along the line OA and flatten the resulting pattern, one would achieve a flattened pyramid such as shown in Figure 3.
  • the distance OA represents the distance from point A which is at the origin of the coordinate system to point O which is at the principal point of the lens.
  • angles AOB, AOC and BOC are known by virtue of knowing the distance between the principal point and the image plane and the measured distance separating two points on the image plane.
  • Figure 4 assists in illustrating how this is done.
  • the XY plane constitutes the image plane of the camera.
  • f0 is the principal point of the lens.
  • Images of points A and B are formed on the image plane after passing through the principal point at locations A and B shown on the XY plane.
  • the incoming rays from points A and B are respectively shown at 400 and 410 of Figure 4.
  • an image plane origin FP0 is defined, and an X axis is defined as parallel to the longest dimension of the image aspect ratio.
  • the Y axis is formed perpendicular thereto, and the origin FP0 lies directly under the principal point.
  • Rays from points A and B form an angle alpha (α) as they pass through the focal point.
  • the projections of those rays beyond the focal point also diverge at α.
  • This angle corresponds to ∠AOB of Figure 3.
  • the distances AFP0 and BFP0 can be determined.
  • the angles separating points A, B and C can be determined in the manner just described.
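In code, each such angle follows directly from the measured image-plane offsets and the principal distance. A small sketch (numpy assumed; names illustrative):

```python
import numpy as np

def subtended_angle(img_a, img_b, principal_distance):
    """Angle (in radians) subtended at the principal point by two image
    points, e.g. angle AOB, given their (x, y) offsets from the
    image-plane origin FP0 and the principal distance."""
    va = np.array([img_a[0], img_a[1], principal_distance], dtype=float)
    vb = np.array([img_b[0], img_b[1], principal_distance], dtype=float)
    cos_angle = np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb))
    return np.arccos(np.clip(cos_angle, -1.0, 1.0))
```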
  • the distances separating points A, B and C are also known, either a priori by placing a calibrated target, such as a carpenter's square in the scene being photographed, or by measuring the distances between three relatively permanent points in the scene previously captured after the images have been formed.
  • the distance OA represents the distance from the principal point of the camera (O) to point A, which is the origin of the coordinate system utilized to define camera position. At a high level, OA is found by first assuming a very low estimate for the distance OB, such as the distance Ob1, and then, with that assumption, solving triangle AOB. "Solving a triangle" means establishing (e.g. calculating) values for the length of each side and for each of the angles within the triangle. With the distance Ob1 assumed, the first triangle is solved using known, assumed or calculated values. In the process, a value for distance OA is calculated. Using the estimate Ob1, the second triangle BOC is solved, and the derived distance OC is then utilized to solve the third triangle COA.
  • the calculated value of OA from the third triangle is compared with the calculated value of OA from the first triangle; the estimate Ob1 is revised by adding the difference between the two values to Ob1, and the process is repeated.
  • the estimate Ob1 is improved in this way until the difference between the calculated values of OA is reduced to a value less than ε. When the difference is low enough for the accuracy needed, the iterations cease, and the true value of OA is assumed to lie between the values calculated for the first and third triangles.
  • Distance Ob1 is the estimate of the length of OB, which, at the outset, is set to be low.
  • the distance AB is known because the dimensions of a calibrated target are known or because the distance AB has been measured after the images are captured.
  • the value for ∠AOB is calculated from measurements from the image plane as illustrated in Figure 4 and discussed in connection with equations 1-7. Therefore, ∠OAB can be calculated from the law of sines: sin(∠OAB) = Ob1 · sin(∠AOB) / AB.
  • the first estimate of ∠OBA can then be calculated as ∠OBA = 180° - ∠AOB - ∠OAB.
  • Ob1 is assumed to be the distance OB.
  • Distance BC is known from the target or measurements and angle BOC is known from measurements from the image plane.
  • the third triangle can be solved in a manner completely analogous to the solution of the second triangle, substituting in the corresponding lengths and angles of the third triangle in equations 8-12.
  • solving the third triangle yields, among its sides, a second value for the distance OA.
  • This distance OA from the third triangle will have been derived based on calculations from the first, second and third triangles. Note, however, that the distance OA from the third triangle and the distance OA from the first triangle should be identical if the assumed value Ob1 were in fact equal to the real length OB. Since Ob1 was initially assumed to be of very low value, there will generally be a difference between the value of OA from the third triangle and that from the first triangle. The difference between the two calculated lengths is added to the original estimate Ob1 to form an estimate Ob2 for the second iteration.
  • the estimate for the distance OB can be made accurate to whatever degree of resolution one desires by continuing the iterative process until the difference between OA from the first triangle and that from the third triangle is reduced to an acceptable level, ε.
  • the distance OA which results from the iterative process is then equal to the distance from the principal point of the camera, shown at O in Figure 3, to point A, which is the origin of the coordinate system defined for this set of measurements.
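The iteration described in the preceding paragraphs can be written compactly. The sketch below solves each triangle with the law of sines, consistent with the relations given above, and applies the preferred 1-to-1 adjustment of the OB estimate; the names, starting estimate and convergence guard are mine:

```python
import math

def solve_viewing_pyramid(AB, BC, CA, ang_AOB, ang_BOC, ang_COA,
                          eps=1e-9, max_iter=10000):
    """Iteratively solve the flattened viewing pyramid of Figure 3 from the
    three known calibration distances and the three angles (radians)
    subtended at the principal point O. Returns (OA, OB, OC)."""

    def adjacent_side(known_adjacent, opposite, apex):
        # Triangle with its apex angle at O: `known_adjacent` is one side
        # meeting at O, `opposite` the side facing O. The law of sines gives
        # the angle facing `known_adjacent`, hence the remaining angle and
        # the other side meeting at O.
        facing = math.asin(min(1.0, known_adjacent * math.sin(apex) / opposite))
        remaining = math.pi - apex - facing
        return opposite * math.sin(remaining) / math.sin(apex)

    ob = 0.01 * AB                                 # deliberately low Ob1
    for _ in range(max_iter):
        oa_first = adjacent_side(ob, AB, ang_AOB)  # first triangle AOB
        oc = adjacent_side(ob, BC, ang_BOC)        # second triangle BOC
        oa_third = adjacent_side(oc, CA, ang_COA)  # third triangle COA
        if abs(oa_third - oa_first) < eps:
            return (oa_first + oa_third) / 2.0, ob, oc
        ob += oa_third - oa_first                  # preferred 1-to-1 update
    raise RuntimeError("no convergence; try another near/mid/far ordering")
```

With OA, OB and OC in hand, the sphere-intersection step described earlier recovers the camera location in the A, B, C coordinate system.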
  • As shown in Figure 5, when viewing the points A, B and C from the principal point of the camera, one cannot necessarily determine which of points A, B and C is closest and which is next closest to the camera. For example, in Figure 5, given that point B1 is closest to the camera, it is possible either that point A is closer and point C farther, or alternatively that point C is closer and point A farther. These differences are reflected in triangle A1B1C1 as compared with triangle A2B1C2.
  • the table shown in Figure 5 illustrates that the relationship between points A, B and C may in general result in six different permutations. There will always be these combinations of near, mid and far when working toward a solution. Right at the start, one does not know which point is closest to the camera, which is furthest and which is at the midpoint.
  • the difference between OA of the first and third triangles is added to the estimate Ob1 to determine the estimate to be utilized in the next iteration. It is, of course, possible to utilize a factor other than 1 to 1 and to adjust the estimate by a fraction or a multiple of the difference between the values of OA for the first and third triangles. The preferred adjustment, however, is 1 to 1.
  • It is preferred that a right angle calibration target be used, such as an 8 1/2 x 11 sheet of paper or a carpenter's square.
  • the six potential arrangements of near, mid and far for points A, B, C can be viewed as different ways of flattening the pyramid.
  • Three sets of flattened pyramids can be formed by using each of the edges OA, OB and OC as the edge which is "opened" (e.g. if the pyramid were formed by folding paper into a pyramid shape, and one edge were cut open and the pyramid unfolded into a pattern like that shown in Figure 3, three different sets of flattened pyramids would be formed, each by cutting a different edge).
  • Each set has two members corresponding to the two orders in which the triangles may occur. As illustrated in Figure 3, for example, the triangles are solved in 1-2-3 order. This ordering represents one of the two members.
  • the other member is formed by flipping the flattened pyramid over on its face so that triangle 3, as shown in Figure 3, is put in the triangle 1 position. This member of the set is solved in 3-2-1 order as labeled.
  • the techniques described herein are applicable to images photographed without a calibrated target. By selecting 3 convenient points on the image and physically measuring the distance between them after the image has been captured, the same effect can be achieved as is achieved using a calibrated target at the time the image is captured.
  • Figure 7 illustrates how azimuthal and elevational corrections are determined.
  • Figure 7 illustrates the image plane.
  • Points A, B and C are the same points utilized to define the coordinate system and to calculate the distance of the camera in that coordinate system.
  • Points A, B and C are illustrated as part of the image shown in the image plane.
  • the center of the plane, i.e. the center of the picture, serves as the origin of the image plane coordinate system 700.
  • a calibrated target or the three points utilized to establish a coordinate system, A, B and C are typically not at the center of the photograph.
  • the azimuthal correction is essentially that required to displace point A, the image of the origin of the external world coordinate system, so that it lies exactly on top of the photographic location of point A, shown to the right of axis 710 of the coordinate system of the image plane.
  • the elevational correction is the angle of elevation or declination required to place the image of point A exactly on top of the photographic location of point A, shown below the abscissa of the image plane coordinate system 700.
  • azimuthal and elevational corrections are determined such that, if they were applied to the camera, point A, the origin of the real world coordinate system, would coincide with point A, the origin as captured on the photograph.
  • Figure 7 assumes that if A is correctly located, points B and C will be correctly located. However, this is generally not true because of tilt of the camera about the optical axis. Once the points A have been superimposed, one knows where point B should be because of the axis definitions in the real world coordinate system. If the origin of the real world coordinate system is centered on A, and the origin of the image plane coordinate system is now also centered on A by virtue of the azimuthal and elevational corrections applied in connection with Figure 7, then point B on the image plane should be located where point B in the real world coordinate system is located. This would be the case if the camera were absolutely horizontal when the picture was taken. However, if there is tilt, B will be displaced off the axis.
  • the B point residual error and the C point residual error are utilized as discriminators.
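As an illustration of the discriminator, the roll about the optical axis can be estimated from where B actually appears versus where it should appear once A has been aligned; a hypothetical sketch (names mine):

```python
import math

def tilt_from_residual(b_observed, b_expected):
    """Rotation about the optical axis (radians) that carries the observed
    image location of point B onto its expected location, both given as
    (x, y) offsets from the aligned image of point A."""
    return (math.atan2(b_expected[1], b_expected[0])
            - math.atan2(b_observed[1], b_observed[0]))
```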
  • Figure 8 illustrates the process utilized to fully determine the location and orientation of a camera from the image.
  • In step 800, one determines the location of the calibration points A, B and C and either knows or measures the distances between them (810).
  • the camera location in XYZ coordinates is determined (820) using the technique set forth in Figure 9. Once the XYZ camera location is determined, corrections are made to azimuth and elevation (830) and then to tilt (840). With azimuth and tilt corrections made, one determines whether the points are correctly located within a desired accuracy ε (850). If they are, the location and orientation of the camera are fully determined (860) and the process ends. If they are not, another iteration of steps 830 and 840 is undertaken to bring the location determination within the desired accuracy.
  • Figure 9 illustrates the details of block 820 of Figure 8. Knowing the principal distance of the camera, one measures the three angles AOB, BOC and COA from the image plane (900).
  • a viewing pyramid is constructed with distance OA assumed as the longest dimension (905).
  • the pyramid is flattened and a value estimated for line segment OB which is known to be low (910).
  • the first triangle is solved (915).
  • Second and third triangles are then sequentially solved using the results of the prior calculations (920 and 925).
  • the value ΔOA is added to the prior estimate of OB to form a new estimate, and a new iteration of steps 915, 920, 925, 930 and 940 occurs. If ΔOA < ε (940), then the viewing pyramid is solved (950), and it is only necessary to resolve the near, mid and far ambiguity (960) before the objective of totally determining the position and orientation of the camera (970) is achieved.
  • the coordinates X0 and Y0 of the point O can be defined with respect to a camera axis as shown in Figures 11 and 12.
  • Figure 13 illustrates a typical real world situation.
  • the points A, B and C represent the calibrated target or the points measured subsequent to image capture.
  • the coordinate system X, Y and Z is established in accordance with the conventions set forth above with A as the origin.
  • Camera positions 1 and 2, illustrated only by their principal points O1 and O2 and their image planes IP1 and IP2 respectively, are positioned with their optical axes pointed at point T, which would be the center of the field on the image plane.
  • Figure 14 illustrates hardware utilized to carry out certain aspects of the invention.
  • Camera 1400 is used to capture images to be analyzed in accordance with the invention.
  • Camera 1400 may be a digital still camera or a video camera with a frame grabber. Images from the camera are loaded onto computer 1420 using camera interface 1410. Normally, images loaded through interface 1410 would be stored on hard drive 1423 and then later retrieved for processing in video RAM 1430. However, images can be loaded directly into video RAM if desired.
  • Video RAM 1430 preferably contains sufficient image storage to permit the simultaneous processing of two images from the camera.
  • Video display 1440 is preferably a high resolution video display such as a cathode ray tube or a corresponding display implemented in semiconductor technology. Display 1440 is interfaced to the computer bus through display interface 1424 and may be utilized to display individual images, both images simultaneously, or three dimensional wireframes created in accordance with the invention.
  • Keyboard 1450 is interfaced to the bus over keyboard interface 1422 in the usual manner.
  • distances may be conveniently measured as numbers of pixels in the vertical and horizontal directions, which may be translated into linear measurements on the display screen knowing the resolution of the display in the vertical and horizontal directions. Numbers of pixels may be readily determined by pointing and clicking on the points under consideration and by obtaining the addresses of the pixels clicked upon from the cursor addresses.
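A small sketch of that measurement, assuming the display scale in each direction is known; the names are illustrative:

```python
def on_screen_distance(p1, p2, mm_per_px_x, mm_per_px_y):
    """Linear distance on the display between two clicked points, given
    their pixel addresses (column, row) obtained from the cursor and the
    display resolution expressed as millimetres per pixel."""
    dx = (p2[0] - p1[0]) * mm_per_px_x
    dy = (p2[1] - p1[1]) * mm_per_px_y
    return (dx * dx + dy * dy) ** 0.5
```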
  • the techniques set forth herein permit accurate forensic surveying of accident or crime scenes as well as accurate surveying of buildings or construction sites, particularly in the vertical direction which had heretofore been practically impossible.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

This invention relates to methods and apparatus for accurately surveying and determining the physical location of objects in a scene, using image data captured by one or more cameras and three points of the same scene which can either be measured after the images are captured or included in a calibrated target placed in the scene at the time the images are captured (800). Objects are located with respect to a three dimensional coordinate system defined with reference to the three points (820). These methods and apparatus permit rapid set-up and capture of accurate position data, using simple apparatus and simple image processing. The precise location and orientation of the camera used to capture each scene are determined from the image data (860), from the location of the three points and from the optical parameters of the camera.
EP96911069A 1996-03-28 1996-03-28 Methods and apparatus for using image data to determine camera location and orientation Withdrawn EP0892911A1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB1996/000424 WO1997036147A1 (fr) 1996-03-28 1996-03-28 Methods and apparatus for using image data to determine camera location and orientation

Publications (1)

Publication Number Publication Date
EP0892911A1 (fr) 1999-01-27

Family

ID=11004428

Family Applications (1)

Application Number Title Priority Date Filing Date
EP96911069A Withdrawn EP0892911A1 (fr) 1996-03-28 1996-03-28 Methods and apparatus for using image data to determine camera location and orientation

Country Status (4)

Country Link
EP (1) EP0892911A1 (fr)
JP (1) JPH11514434A (fr)
AU (1) AU5406796A (fr)
WO (1) WO1997036147A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3178884A1 2015-12-08 2017-06-14 Evonik Degussa GmbH Aqueous composition comprising [3-(2,3-dihydroxyprop-1-oxy)propyl] silanol oligomers, process for preparation thereof and use thereof

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU7768298A (en) * 1998-06-24 2000-01-10 Sports Training Technologies, S.L. Method for capturing, analyzing and representing the movement of bodies and objects
DE19922321C2 (de) * 1999-05-14 2002-07-18 Zsp Geodaetische Sys Gmbh Method and arrangement for carrying out geodetic measurements by means of a video tacheometer
DE19922341C2 (de) * 1999-05-14 2002-08-29 Zsp Geodaetische Sys Gmbh Method and arrangement for determining the spatial coordinates of at least one object point
JP3635540B2 (ja) * 2002-08-29 2005-04-06 Olympus Corporation Calibration pattern unit
EP2166510B1 (fr) * 2008-09-18 2018-03-28 Delphi Technologies, Inc. Procédé de détermination de la position et de l'orientation d'une caméra installée dans un véhicule
US9160979B1 (en) * 2011-05-27 2015-10-13 Trimble Navigation Limited Determining camera position for a photograph having a displaced center of projection
US9524436B2 (en) * 2011-12-06 2016-12-20 Microsoft Technology Licensing, Llc Augmented reality camera registration
US8855442B2 (en) * 2012-04-30 2014-10-07 Yuri Owechko Image registration of multimodal data using 3D-GeoArcs
WO2016018411A1 (fr) 2014-07-31 2016-02-04 Hewlett-Packard Development Company, L.P. Mesure et correction de désalignement optique
CA3141704A1 (fr) * 2018-05-25 2019-11-28 Packsize International Llc Systems and methods for multi-camera placement
US11022972B2 (en) * 2019-07-31 2021-06-01 Bell Textron Inc. Navigation system with camera assist
CN114842164B (zh) * 2022-06-17 2023-04-07 PLA Army Academy of Artillery and Air Defense Burst point coordinate calculation method and system based on a three-dimensional geographic model

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4965840A (en) * 1987-11-27 1990-10-23 State University Of New York Method and apparatus for determining the distances between surface-patches of a three-dimensional spatial scene and a camera system
US5259037A (en) * 1991-02-07 1993-11-02 Hughes Training, Inc. Automated video imagery database generation using photogrammetry
US5361217A (en) * 1992-05-07 1994-11-01 Fuji Photo Optical Co., Ltd. Position measuring/plotting apparatus
US5365597A (en) * 1993-06-11 1994-11-15 United Parcel Service Of America, Inc. Method and apparatus for passive autoranging using relaxation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO9736147A1 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3178884A1 2015-12-08 2017-06-14 Evonik Degussa GmbH Aqueous composition comprising [3-(2,3-dihydroxyprop-1-oxy)propyl] silanol oligomers, process for preparation thereof and use thereof
US10266655B2 (en) 2015-12-08 2019-04-23 Evonik Degussa Gmbh Aqueous composition comprising [3-(2,3-dihydroxyprop-1-oxy)propyl] silanol oligomers, process for preparation thereof and use thereof

Also Published As

Publication number Publication date
AU5406796A (en) 1997-10-17
WO1997036147A1 (fr) 1997-10-02
JPH11514434A (ja) 1999-12-07

Similar Documents

Publication Publication Date Title
US5699444A (en) Methods and apparatus for using image data to determine camera location and orientation
US11200734B2 (en) Method for reconstructing three-dimensional space scene based on photographing
US6246412B1 (en) Interactive construction and refinement of 3D models from multiple panoramic images
CN110296691B (zh) Binocular stereo vision measurement method and system incorporating IMU calibration
JP4685313B2 (ja) Method for processing passive volumetric images of arbitrary aspects
Pollefeys et al. Self-calibration and metric reconstruction inspite of varying and unknown intrinsic camera parameters
US8107722B2 (en) System and method for automatic stereo measurement of a point of interest in a scene
US6271855B1 (en) Interactive construction of 3D models from panoramic images employing hard and soft constraint characterization and decomposing techniques
US6084592A (en) Interactive construction of 3D models from panoramic images
Wang et al. Accurate georegistration of point clouds using geographic data
EP0892911A1 (fr) Methods and apparatus for using image data to determine camera location and orientation
US8509522B2 (en) Camera translation using rotation from device
TW565736B (en) Method for determining the optical parameters of a camera
CN115830116A (zh) A robust visual odometry method
Rawlinson Design and implementation of a spatially enabled panoramic virtual reality prototype
Negahdaripour et al. Integrated system for robust 6-dof positioning utilizing new closed-form visual motion estimation methods in planar terrains
Ahmadabadian Photogrammetric multi-view stereo and imaging network design
Perfant et al. Scene registration in aerial image analysis
Huang et al. Calibration of line-based panoramic cameras
Stylianidis et al. Measurements: Introduction to Photogrammetry
Erdnüß A review of the one-parameter division undistortion model
Scheibe Design and test of algorithms for the evaluation of modern sensors in close-range photogrammetry
Chen et al. A Vision-aided Localization and Geo-registration Method for Urban ARGIS Based on 2D Maps.
CN115375748A (zh) Deformation determination method, apparatus and electronic device
Srestasathiern Line Based Estimation of Object Space Geometry and Camera Motion

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 19970930

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): DE FR GB

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 19990702