WO1998008152A2 - Computer vision system modeling apparatus - Google Patents

Computer vision system modeling apparatus

Info

Publication number
WO1998008152A2
Authority
WO
WIPO (PCT)
Prior art keywords
computer
scene
model
real
camera
Prior art date
Application number
PCT/US1996/013603
Other languages
English (en)
Other versions
WO1998008152A3 (fr)
Inventor
John Ellenby
Thomas Ellenby
Peter Ellenby
Original Assignee
Criticom Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Criticom Corporation filed Critical Criticom Corporation
Priority to PCT/US1996/013603 priority Critical patent/WO1998008152A2/fr
Priority to AU69003/96A priority patent/AU6900396A/en
Publication of WO1998008152A2 publication Critical patent/WO1998008152A2/fr
Publication of WO1998008152A3 publication Critical patent/WO1998008152A3/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects

Definitions

  • This instant invention is generally concerned with computer vision systems and specifically concerned with computer vision systems combined with computer aided design facilities.
  • Computer aided design and modeling techniques are useful in various fields. Commonly referred to as CAD, computer aided design allows an electronic computer to be used to model devices, objects or environments. Changes to the model are easily made, and a great number of variations might be tried before arriving at a final desired result. Examples of uses of CAD include: by architects for buildings proposed to be built, or by automobile designers for cars proposed to be manufactured. A designer may start from a conceptual idea, a photograph, an artist's drawing or other medium. From that initial idea or representation, a designer may construct a model on a computer. A model consists of a plurality of individual graphical objects, each having some correspondence to some real object.
  • A photograph suggests to the CAD designer how particular features of the subject may look and gives hints as to how they might be implemented in a graphical representation.
  • With advanced computers it is even possible for the photograph to be scanned into an electronic image and combined with the CAD drawing, thereby facilitating formulation of a more accurate model. Converting real world objects into a computer model is of great interest to CAD designers as it provides very sophisticated designs in early stages of development.
  • Novel techniques have been discovered which provide very specialized uses of vision systems, and in particular as they may relate to computer aided modeling and design. While the systems and inventions of the prior art are designed to achieve particular goals and objectives, some of those being no less than remarkable, these inventions have limitations which prevent their use in new ways now possible. These prior art inventions are not used and cannot be used to realize the advantages and objectives of the present invention.
  • The invention is a vision system computer modeling apparatus including devices for creating a computer model of a real scene. It is a primary function of this system to provide means and method for computer modeling. In contrast to the present system, prior methods and devices do not interact with real scenes with respect to perspective and spatial relationships as measured in real-time. A fundamental difference between the computer modeling apparatus of the instant invention and those of the art can be found when considering its interaction with the scene being modeled.
  • An electronic imaging system combined with a computer aided design system and in communication with position, attitude, and optionally range measuring devices forms a vision system computer modeling apparatus.
  • This vision system computer modeling apparatus is operable for addressing a real scene from some point-of-view, or perspective.
  • the model presented at a display as a two-dimensional figure has associated therewith a perspective which relates to a perspective of a three-dimensional real scene.
  • the model as presented to a user is continuously responsive to changes in perspective.
  • As the imaging system is moved such that the perspective of the scene changes, the perspective of the model similarly and correspondingly changes as well. This is accomplished via measurements of position and attitude of the apparatus with respect to the scene being addressed.
  • a designer may view a scene in real time and construct a model of the scene as it is viewed.
  • Computer graphical objects can be created and associated with attributes of the real scene.
  • The graphic objects can be indexed to the real position and orientation of the objects they represent. As the camera is moved, the appearance of the graphical objects in the model can change in a fashion which corresponds to the way the real objects would be imaged from the new camera position.
  • the invention thus stands in contrast to methods and devices known previously.
  • the invention is a computer modeling apparatus with live real-time images and position and attitude measurement devices which relate the computer model to the scene being modeled in proper perspective for all relative positions of the apparatus with respect to the scene.
  • Presently CAD systems are typically computer workstations with the ability to scan photographs and other graphics which may relate to a scene being modeled.
  • Figure 1 is a line drawing of objects having a certain spatial relationship
  • Figure 2 is a different view of the same scene
  • Figure 3 shows a close-up and normal view at once
  • Figure 4 shows a resizing operation
  • Figure 5 shows a scene space geometry
  • Figure 6 shows a top view of that same geometry
  • Figures 7 and 8 illustrate line elements generated from various positions
  • Figures 9, 10, 11, and 12 show the side of a building being modeled
  • Figure 13 is a flow diagram for a modeling method
  • Figure 14 is a system block diagram.
  • an apparatus for creating a computer model of a scene where the model is responsive to the position and attitude of the apparatus.
  • the apparatus of one preferred embodiment may be different than the apparatus of another embodiment.
  • Many alternatives and versions may exist for various components of these systems. Adoption of certain variations can be made without deviation from the true spirit of the invention.
  • the scope of the following disclosure should not be limited to the example elements chosen here for illustration.
  • the scope of the invention is set forth by the appended claims. The following examples are considered by the inventors to be the best of many possible modes of executing the invention.
  • An apparatus of the invention may comprise: 1) a camera; 2) a computer; and 3) position and attitude measuring devices.
  • Real-time electronic digital video is available by way of common CCD type electronic cameras.
  • Hand held camcorders can be battery powered and highly portable. They allow a user to view some real scene at a display device, for example a liquid crystal display device.
  • The digital signal they produce is compatible with computer processing operations.
  • Computers use similar liquid crystal displays for user interface.
  • the images computers generate therefore are compatible with presentation on those devices. Therefore the combination of electronic cameras and computers is facilitated by the common display format.
  • Very advanced design software is now available having graphical modeling capabilities.
  • CAD methodology permits graphical objects to be manipulated in a great plurality of ways so that they may represent real objects in a computer model.
  • the present invention therefore can most simply be described as the combination of a real-time electronic camera with a computer running CAD type modeling software.
  • a user may address some real scene by pointing the camera thereat from a fixed position.
  • a computer is connected to the camera and is running CAD software, a user may create a model of the real scene as it is viewed.
  • The system computer is preferably arranged to be in communication with position and attitude determining devices. If the device is moved, the perspective of the real scene necessarily changes. To account for this change in the model, the computer mathematically determines a new perspective for the objects of the model by applying rotation, scaling and translation algorithms which are responsive to the camera's new position, such that the perspective of the model continuously updates in accordance with the true perspective of the real scene.
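The perspective update described above can be illustrated with a minimal pinhole-projection sketch. This is a hypothetical Python example, not the patent's implementation: the yaw-only attitude, the coordinate convention (x right, y forward along the viewing axis, z up) and the unit focal length are all simplifying assumptions.

```python
import math

def project_point(point, cam_pos, yaw, focal=1.0):
    # Translate the world point into the camera frame, rotate by the
    # measured attitude (yaw about the vertical axis only, for brevity),
    # then apply a pinhole projection so the model can be redrawn from
    # the camera's new point of view.
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = point[2] - cam_pos[2]
    c, s = math.cos(-yaw), math.sin(-yaw)
    xc = c * dx + s * dy
    yc = -s * dx + c * dy   # depth along the viewing axis
    if yc <= 0:
        return None          # point is behind the camera
    # screen coordinates scale with 1/depth: nearer objects appear larger
    return (focal * xc / yc, focal * dz / yc)
```

Calling this for every node of the model whenever new position and attitude measurements arrive keeps the displayed model in the true perspective of the real scene.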
  • Consider a camera addressing a scene, Figure 1, of a box (11) five meters in front of the camera and a ball (12) five meters further in the same direction.
  • The apparatus would have sufficient data to recall a model of a cube (33) and a sphere (34) to match the box (31) and the ball (32).
  • The recalled models would then be scaled and oriented by the user or the apparatus itself, and placed so that the models (43, 44) coincided with the real objects (41, 42), as is shown in Figure 4.
  • For the computer to properly size the two objects, it may be required to re-size ("stretch") the objects, either manually or automatically via image recognition for example, to fit the new size of the corresponding real box and ball.
  • The computer could estimate, via a triangulation routine, the relative positions with respect to the measured camera displacement. Additional camera displacements could result in automatic graphic object updates.
  • the first re-sizing operation would calibrate the device and further adjustments may be calculated and the re-sizing of the model objects performed without further user input.
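The triangulation from a measured camera displacement can be illustrated with the classic similar-triangles relation. This Python sketch is hypothetical: it assumes the displacement is a lateral baseline and that the object's apparent shift is measured in pixels with a known focal length, neither of which the text specifies.

```python
def estimate_depth(baseline, focal_px, x1_px, x2_px):
    # The same object imaged from two camera positions separated by a
    # known lateral displacement (baseline, in meters) shifts across the
    # image by the disparity x1 - x2 (in pixels).  By similar triangles
    # the object's depth is Z = f * B / d.
    disparity = x1_px - x2_px
    if disparity == 0:
        raise ValueError("no apparent shift; object effectively at infinity")
    return focal_px * baseline / disparity
```

With depths recovered this way, the first re-sizing calibrates the device and later re-sizings can proceed without further user input, as the passage above notes.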
  • An alternate apparatus of the invention may comprise: 1) a camera; 2) a computer; 3) position and attitude measuring devices, and 4) a range finding device.
  • the addition of a ranging capability enhances the ability of the apparatus to create three dimensional models of the scene being addressed.
  • To calculate the position of a point in the scene, the user would range on the point of interest and the unit, given position and attitude data, could calculate the coordinates of the point directly. It is important to realize that the invention does not propose a new method of position calculation based on unit position and attitude, and range to object. This method has been in use, in radar directed gunnery for example, for many years, though not for developing computer graphical models.
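The range-based calculation described above amounts to resolving the measured range along the camera's pointing direction. A minimal Python sketch, where the azimuth/elevation convention (azimuth measured from the +y axis toward +x, elevation from the horizontal) is an assumption for illustration:

```python
import math

def point_from_range(unit_pos, azimuth, elevation, rng):
    # Given the unit's position, the pointing attitude of the camera
    # axis (azimuth and elevation, radians) and a range measurement,
    # the target coordinates follow directly -- the classical
    # fire-control calculation the text refers to.
    x = unit_pos[0] + rng * math.cos(elevation) * math.sin(azimuth)
    y = unit_pos[1] + rng * math.cos(elevation) * math.cos(azimuth)
    z = unit_pos[2] + rng * math.sin(elevation)
    return (x, y, z)
```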
  • range based position calculation enables the user to quickly develop a model based on nodes and connecting lines and curves, much as in CAD type programs.
  • range based position calculation enables the unit to use advanced modeling methods based on image processing such as edge detection within a defined area and coplanar modeling.
  • a user of the apparatus may choose to use a method of coplanar modeling, Figure 13.
  • A user would range to sufficient points (at least three are required to define a plane, but the user could define as many points as are needed to define the area of interest) to define the boundaries of a planar area 1302. These boundaries may or may not be part of the model.
  • the user may also define areas within these edge boundaries that the unit is to ignore.
  • the user may cut pieces out of the defined area that will not be analyzed by the unit, in effect setting up interior boundaries. This would enable the user to define exactly the area that is to be analyzed by the unit.
  • the unit then calculates the equation defining the plane 1303.
  • the unit detects, using edge detection and other known methods, all lines and curves within the defined area 1304 and displays them to the user 1305.
  • the user then deletes all lines that are not to be modeled 1306.
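Step 1303 above, calculating the plane equation from the ranged boundary points, can be sketched as follows. This is an illustrative Python version using the first three points to fix the plane (the text permits more points for outlining the area):

```python
def plane_from_points(p1, p2, p3):
    # Form two in-plane vectors from the three ranged points, take
    # their cross product for the plane normal (a, b, c), then solve
    # a*x + b*y + c*z = d using any one of the points.
    u = tuple(p2[i] - p1[i] for i in range(3))
    v = tuple(p3[i] - p1[i] for i in range(3))
    normal = (u[1] * v[2] - u[2] * v[1],
              u[2] * v[0] - u[0] * v[2],
              u[0] * v[1] - u[1] * v[0])
    d = sum(normal[i] * p1[i] for i in range(3))
    return normal, d
```

Once the plane equation is known, any pixel bearing from the camera can be intersected with the plane, which is what makes the subsequent edge-detection results positionable in three dimensions.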
  • Figures 9-12 illustrate the basic steps involved in relation to modeling a wall of an office building.
  • the user defines the planar area by ranging to vertices 1-6.
  • the user confirms the process by using an interactive graphical user interface.
  • the unit displays the calculated model to the user.
  • a continuation of the coplanar method would be to use a pair of planar models, created from different positions, using the same boundary limits.
  • Figure 5 shows a general situation in which a user defines a planar area 51 that has a line 52 sticking out of it at right angles. The viewer would go through the planar modeling process from viewing position #1 53 and from viewing position #2 54 to generate the coordinates for node A 55 within the plane and to calculate the coordinates of the endpoint, node B 56, without the user having to range to either point specifically. In this simple situation it would seem easier to just range on the two points in question, nodes A & B, and have done with it.
  • Figure 6 gives a plan view of the situation and clearly shows that, if node B 62 is assumed to be in the defined planar area 61, the coordinates calculated for the intersection of a line, from the two viewing positions 63, 64 to node B 62, with the plane 61 will produce different coordinates, nodes B1 66 and B2 65. In the case of node A 61 the change in viewing position will not produce a change in coordinates because node A 61 is indeed in the defined plane.
  • These coordinates, as calculated from each viewing position, are shown in Figures 7 and 8. These figures clearly show that node A 71, 81 is within the plane. Figures 7 and 8 also show, through the discrepancies in the positions of nodes B1 72 and B2 82, that the real position of node B is not within the defined plane. To calculate the actual coordinates of node B is quite simple. Given the knowledge of the location of each viewing position and the knowledge of the bearings, both horizontal and vertical, to the node, based on the pixel counting angular offset method, from each viewing position, the unit could calculate the actual position of the node by triangulation.
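The triangulation step just described can be sketched in two dimensions using only the horizontal bearings. This ray-intersection formulation is an assumption for illustration, not the patent's stated algorithm; each viewing position plus its measured bearing defines a ray, and the node lies where the two rays cross.

```python
import math

def triangulate(pos1, az1, pos2, az2):
    # Each ray is p + t * (sin(az), cos(az)) with azimuth measured from
    # the +y axis.  Solving pos1 + t*d1 = pos2 + s*d2 as a 2x2 linear
    # system gives the intersection, i.e. the node's actual position.
    d1 = (math.sin(az1), math.cos(az1))
    d2 = (math.sin(az2), math.cos(az2))
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if abs(det) < 1e-12:
        raise ValueError("bearings are parallel; node cannot be fixed")
    rx, ry = pos2[0] - pos1[0], pos2[1] - pos1[1]
    t = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (pos1[0] + t * d1[0], pos1[1] + t * d1[1])
```

Repeating the same construction with the vertical bearings recovers the node's height, fixing node B in three dimensions without ranging to it directly.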
  • The primary elements of the systems are: a camera to acquire a digital image of a real scene; a computer to run CAD type software and combine graphical objects with the real image, and further to compute perspective relationships between a real 3-D scene and a 2-D perspective of that scene, the computing routines being responsive to position, attitude and ranging measurements; and apparatus operable for measuring position, attitude and range.
  • the camera may be a camcorder type, a portable battery operated device for producing a digital video signal, in communication with a computer processor.
  • the computer may be of the type known as a personal computer, having a microprocessor (for example the Pentium or PowerPC processors which have been employed to execute code developed for computer modeling), further being operable for combining an image from an electronic camcorder with a computer generated graphical model image, and further being in communication with and responsive to apparatus for measuring position, attitude and range of the camera.
  • the apparatus for measuring position, attitude and range may be as follows.
  • the position measuring apparatus may be a large scale system such as the Global Positioning System (GPS), or may be a local system arranged with a particular environment in mind, such as a city block or a single room of a building, each employing some radio location measures; depending upon desired accuracy, the choice of appropriate position measuring system will vary;
  • the attitude measuring apparatus may be an electronic compass, flux gate compass, interferometer ring gyro or other device which is operable for measuring the relative pointing direction of the camera imaging axis;
  • the ranging apparatus may be an ultrasonic or radar device which measures the relative position of an object by timing a test pulse which is reflected therefrom.
  • Alternate means of ranging, such as laser or infrared rangefinders, may be equally effective as long as it is possible to determine the distance from the camera to a particular object and relay that measurement to the computer.
  • a viewing path is defined by a reticule system which provides a reference or "pointing" axis for the devices.
  • "pointing" and "clicking" may be done via a mouse peripheral device.
  • An icon for example an arrow, designates a position on the model field as displayed on the screen.
  • Objects of the model are manipulated by pointing and clicking, clicking and dropping, etc.
  • the model may relate to some real scene, but is not generally responsive thereto.
  • pointing, clicking and dragging operations may be implemented by manipulation of a pointing axis into a real scene. Clicking is accomplished by a simple switch on the device sensitive to tactile manipulation. Again, the system's ability to know its position and pointing attitude as well as certain range data allows it to directly interface with the real world.
  • The model is sensitive and responsive to the position and attitude of the device in the 3-D scene. The displayed model, therefore, corresponds to how the scene would look from any perspective.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a vision system for producing models of an environment or scene. A user observes a view through an electronic camera, and the system offers computer-aided-design type functions for creating graphical representations of the view. Since the apparatus knows where it is relative to the objects of the scene, and relative to where it has previously been, the perspective and size of the model can be continuously updated to reflect the view of the scene as presented to the camera at any point in space where it is located.
PCT/US1996/013603 1996-08-23 1996-08-23 Appareil de production de maquettes par systeme visionique WO1998008152A2 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/US1996/013603 WO1998008152A2 (fr) 1996-08-23 1996-08-23 Appareil de production de maquettes par systeme visionique
AU69003/96A AU6900396A (en) 1996-08-23 1996-08-23 Computer vision system modeling apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US1996/013603 WO1998008152A2 (fr) 1996-08-23 1996-08-23 Appareil de production de maquettes par systeme visionique

Publications (2)

Publication Number Publication Date
WO1998008152A2 true WO1998008152A2 (fr) 1998-02-26
WO1998008152A3 WO1998008152A3 (fr) 1998-03-26

Family

ID=22255653

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1996/013603 WO1998008152A2 (fr) 1996-08-23 1996-08-23 Appareil de production de maquettes par systeme visionique

Country Status (2)

Country Link
AU (1) AU6900396A (fr)
WO (1) WO1998008152A2 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4106218A (en) * 1975-06-25 1978-08-15 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Simulator method and apparatus for practicing the mating of an observer-controlled object with a target
US4672438A (en) * 1985-06-28 1987-06-09 Her Majesty The Queen In Right Of Canada Tracking simulator
US5184295A (en) * 1986-05-30 1993-02-02 Mann Ralph V System and method for teaching physical skills


Also Published As

Publication number Publication date
AU6900396A (en) 1998-03-06
WO1998008152A3 (fr) 1998-03-26

Similar Documents

Publication Publication Date Title
US6535210B1 (en) Vision system computer modeling apparatus including interaction with real scenes with respect to perspective and spatial relationship as measured in real-time
US6081273A (en) Method and system for building three-dimensional object models
Guillou et al. Using vanishing points for camera calibration and coarse 3D reconstruction from a single image
US7663649B2 (en) Information processing device and method for aiding control operations relating to controlling the position and orientation of a virtual object and method for changing the positioning and orientation of a virtual object
Allen et al. Seeing into the past: Creating a 3D modeling pipeline for archaeological visualization
JP6057298B2 (ja) 迅速な3dモデリング
Baillot et al. Authoring of physical models using mobile computers
US20120120113A1 (en) Method and apparatus for visualizing 2D product images integrated in a real-world environment
Sequeira et al. Automated 3D reconstruction of interiors with multiple scan views
EP0782100A2 (fr) Appareil et méthode d'extraction de formes tridimensionelles
Gomez-Jauregui et al. Quantitative evaluation of overlaying discrepancies in mobile augmented reality applications for AEC/FM
JP2010109783A (ja) 電子カメラ
JP2022097699A (ja) 入力装置および入力装置の入力方法、ならびに、出力装置および出力装置の出力方法
CN110084878A (zh) 用于在真实环境中表示虚拟信息的方法
Lin et al. Development of a virtual reality GIS using stereo vision
EP3330928A1 (fr) Dispositif de génération d'image, système de génération d'image et procédé de génération d'image
Grussenmeyer et al. 4.1 ARCHITECTURAL PHOTOGRAMMETRY
Streilein Towards automation in architectural photogrammetry: CAD-based 3D-feature extraction
Bunnun et al. OutlinAR: an assisted interactive model building system with reduced computational effort
Luo et al. An Internet-enabled image-and model-based virtual machining system
Dias et al. 3D reconstruction of real world scenes using a low‐cost 3D range scanner
Radanovic et al. Aligning the real and the virtual world: Mixed reality localisation using learning-based 3D–3D model registration
JP2004272515A (ja) インタフェース方法、装置、およびプログラム
CN111127661B (zh) 一种数据处理方法、装置及电子设备
El-Hakim et al. Two 3D Sensors for Environment Modeling and Virtual Reality: Calibration and Multi-View Registration

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AU CA CH JP KR NZ US

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE

AK Designated states

Kind code of ref document: A3

Designated state(s): AU CA CH JP KR NZ US

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE

WA Withdrawal of international application
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: CA