US20100042382A1 - Method and Apparatus for Localizing and Mapping the Position of a Set of Points on a Digital Model - Google Patents
- Publication number
- US20100042382A1 (application US12/603,276)
- Authority
- US
- United States
- Prior art keywords
- sensor unit
- digital model
- physical
- points
- physical object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/0006—Industrial image inspection using a design-rule based approach
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/75—Determining position or orientation of objects or cameras using feature-based methods involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; Machine component
Abstract
A computer implemented method, apparatus, and computer usable program product for mapping a position of a physical point on a digital model. The method comprises receiving sensor data for the physical point from a sensor unit and correlating a position of the physical point to the digital model. The computer implemented method calculates an alignment probability for the correlated position and then compares the calculated value to a stored threshold value. If the alignment probability for the correlated position exceeds the predetermined threshold probability, an aligned position is formed. Responsive to a store map location command, the aligned position of the physical point is stored relative to the digital model.
Description
- 1. Field
- The present invention relates generally to an improved data processing system and in particular to mapping the position of a point. Still more particularly, the present invention relates to a computer implemented method, apparatus, and computer usable program product for sensing the position of a set of points relative to a physical object and correlating the position of the set of points to a digital model of the physical object.
- 2. Background
- An example of a point is a defect on a manufactured part. A set of points is one or more points. A set of points may, for example, form a circle, an ellipse, a line, and/or a three-dimensional shape. As used herein, the term “defect” is used in manufacturing to denote an imperfection in a part under manufacture. Engineers study defects to correct the current defective part. Defects are also studied to understand and prevent further defective parts by identifying defect trends over time. To effectively find and correct the cause of defects, engineers may need to identify the location of a given defect.
- Defect mapping is a method of localizing and communicating the position of a defect in a defect report. Defect position information may take the form of a simple “x” or other marking on a drawing, or other pictorial representation of the part. Currently, a user may make an imprecise identification of the location of the defect on the actual physical part, and correlate that location to a paper drawing to identify the location of the defect in the paper drawing. However, many times the paper drawing of the part will not show a view of the part necessary to localize the defect. Also, the scale of the “x” marking the defect location on the paper drawing may be such that a large area of the physical object is searched to find a small defect. In addition, indicating a location of a defect through marking a drawing by hand may provide incomplete information regarding the location of a defect.
- In other scenarios, the defect location may be logged into an existing database by a user, who may measure, estimate, or omit the three-dimensional coordinates X, Y, and Z that define the position of the defect. These processes are imprecise, inaccurate, and burdensome to users.
- Therefore, it would be advantageous to have an improved method, apparatus, and computer usable program code for identifying defects on a physical object.
- Advantageous embodiments of the present invention provide a computer implemented method, apparatus, and computer usable program product for mapping a set of physical points to a digital model. The computer implemented method for locating defects comprises receiving sensor data for a defect on a manufactured object, correlating the location of the defect on the manufactured object to a location on a digital model of the manufactured object, and storing the sensor data for the defect in a database in association with the location on the digital model.
- In another embodiment, a process receives sensor data for a set of physical points from a sensor unit and correlates a position of the set of physical points to a digital model. The process calculates an alignment probability for the correlated position and compares the calculated value to a stored threshold value. If the alignment probability for the correlated position exceeds the predetermined threshold probability, an aligned position is formed. Responsive to a store map location command, the aligned position of the set of physical points is stored relative to the digital model.
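The receive, correlate, score, threshold, and store sequence described above can be sketched in a few lines of Python. Everything here (the nearest-feature matcher, the toy alignment probability, and all names) is an illustrative assumption, not the patent's actual implementation.

```python
def correlate(sensor_xyz, place_identifiers):
    """Match the sensed point to the nearest place-identifier (a toy
    stand-in for image-to-place-identifier matching)."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(place_identifiers,
               key=lambda name: dist2(place_identifiers[name], sensor_xyz))

def map_point(sensor_xyz, place_identifiers, stored, threshold,
              store_command=True):
    """Correlate a sensed physical point to the model, test the alignment
    probability against the stored threshold, and save the aligned position."""
    name = correlate(sensor_xyz, place_identifiers)
    d2 = sum((u - v) ** 2
             for u, v in zip(place_identifiers[name], sensor_xyz))
    probability = 1.0 / (1.0 + d2)   # toy score: decays with distance
    if probability <= threshold:
        return None                  # unaligned: no aligned position formed
    if store_command:                # the "store map location" command
        stored.append((name, sensor_xyz))
    return name
```

For example, a point sensed near a known bolt would be stored against that bolt's model coordinates, while a point that matches nothing well returns an unaligned indication.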
- In another illustrative embodiment, a computer program product having computer usable program code encompasses the steps for mapping a set of physical points to a digital model. The computer program product is executed to perform the steps of a computer implemented method for mapping a set of physical points to a digital model. The computer implemented method comprises receiving sensor data for the set of physical points from a sensor unit to form received sensor data, and aligning the received sensor data from the sensor unit to a set of place-identifiers in the digital model, wherein a position of the set of physical points is correlated to the digital model to form a correlated position.
- In yet another illustrative embodiment, an apparatus comprises the sensors and a data processing system for performing the steps for mapping a set of physical points to a digital model.
- The features, functions, and advantages can be achieved independently in various embodiments of the disclosure, or may be combined in yet other embodiments.
- The novel features believed characteristic of the invention are set forth in the appended claims. The invention itself, however, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an advantageous embodiment of the present invention when read in conjunction with the accompanying drawings, wherein:
- FIG. 1 is a diagram of a point mapping system in accordance with an advantageous embodiment of the present invention;
- FIG. 2 is a block diagram of components for a point mapping system in accordance with an advantageous embodiment of the present invention;
- FIG. 3 is a block diagram of a data processing system in accordance with an advantageous embodiment of the present invention;
- FIG. 4 is a diagram of a compact unit for a point mapping system in accordance with an advantageous embodiment of the present invention;
- FIG. 5 is a diagram of an exemplary compact unit for a point mapping system in accordance with another advantageous embodiment of the present invention;
- FIG. 6 is a block diagram of a digital model for a point mapping system in accordance with an advantageous embodiment of the present invention;
- FIG. 7 is a block diagram illustrating components in a correlation module for a point mapping system in accordance with an advantageous embodiment of the present invention;
- FIG. 8 is a high level localization flowchart of a point mapping system in accordance with the advantageous embodiments of the present invention;
- FIG. 9 is a high level flowchart illustrating the process flow for a simultaneous localization and mapping module of the point mapping system in accordance with an advantageous embodiment of the present invention; and
- FIG. 10 is a flowchart of the save map position process for the point mapping process in accordance with an advantageous embodiment of the present invention.
- With reference now to the figures, and in particular with reference to FIG. 1, a diagram of a point mapping system is depicted in accordance with an advantageous embodiment of the present invention. It should be appreciated that FIG. 1 is only exemplary and is not intended to assert or imply any limitation with regard to the environment in which different embodiments may be implemented.
- Factory parts may have digital models stored in a data processing system, typically a computer aided design (CAD) system. Computer aided design is a dominant geometry-authoring tool and involves both software and, in some instances, special purpose hardware. Current computer aided design systems create digital models that range from two-dimensional (2D) vector based drafting systems to three-dimensional (3D) solid and surface modelers.
- A digital model is a multi-dimensional geometric object description. The digital format of the model includes a description of the object using coordinates and numerical descriptions of the position of the object in a multi-dimensional space. This format is typically used in a computer aided drawing system. Digital models may be suitable for documentation and visualization, as well as for complex simulations, manipulations, and analysis of the object.
- A digital model may use place-identifiers to denote features on or in a physical object. A place-identifier is a feature on the physical object that serves as a guide in location, such as the edge of a fuel tank or a series of fasteners, for example, associated with an airplane object. An airplane object is an airplane or any part of an airplane. The digital model is comprised of a plurality of place-identifiers, each with unique coordinates within the digital model.
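One way to picture this structure is a model holding uniquely positioned, named features. The classes and field names below are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PlaceIdentifier:
    name: str            # e.g. "fuel tank edge", "fastener row"
    coordinates: tuple   # unique (x, y, z) within the digital model

@dataclass(frozen=True)
class DigitalModel:
    place_identifiers: tuple   # the model's plurality of place-identifiers

    def find(self, name):
        """Return the named feature that serves as a location guide."""
        for pid in self.place_identifiers:
            if pid.name == name:
                return pid
        return None
```

Each place-identifier carries its own coordinates, so looking one up immediately localizes that feature within the model.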
- In some instances, digital models are implemented in the design phase of a manufactured object or part. The digital models may be used to create castings or molds for the manufactured part. Digital models may also be used to provide a three-dimensional map of the part.
- An implementation of an advantageous embodiment of the present invention allows a user to aim a sensor unit at a set of points on a manufactured object and correlate the set of points to a digital map of the manufactured object. Therefore, upon recalling the digital map, a user may see accurately and precisely where the set of points is located. This feature of the advantageous embodiment is well suited to track defects on manufactured objects.
- The different embodiments recognize that current defect position techniques do not provide a method for sensing the position of a set of points on a physical object and correlating the position of the set of points to the digital model of the physical object. As users build and assemble parts or perform maintenance and repairs on parts, defects may be observed. One of the advantageous embodiments of the present invention provides a computer implemented method, apparatus, and computer program product for mapping the defects observed.
-
FIG. 1 shows physical object 102. Physical object 102 may be any physical object, including parts under manufacture or repair. For example, the advantageous embodiments may be implemented on parts that have both internal and external surfaces, and/or expansive parts with minimal features. An internal surface may include, for example, the inside of a fuel tank or wing box for an airplane wing. Parts may also have featureless expanses. A featureless expanse is a portion of a part that does not have place-identifiers. For example, a section of sheet metal used for a wing component on an airplane may include a featureless expanse along the outer surface of the wing. - An advantageous embodiment provides
point mapping system 104 equipped with sensor unit 106 to scan a defect on a part, such as physical object 102. In this example, a user, such as a factory operator who observes a defect, carries sensor unit 106 to the defect on the part under assembly. If the defect is observed in the inside of a fuel tank, for example, sensor unit 106 may be taken inside the fuel tank. The operator may press a button on sensor unit 106 to activate a store map location command when the operator perceives that the defect is sensed by sensor unit 106. A laser pointer indicating the sensing position may optionally be included. Point mapping system 104 also may be mounted on a robot that scans physical object 102 for defects. Point mapping system 104 may contain data processing system 108 and/or display 110 in a compact unit, or data processing system 108 and/or display 110 may be located remote to sensor unit 106. Digital object 112 is the digital model of physical object 102 and may be viewed on display 110. -
Physical object 102 is marked with an “x” icon denoting defect 114 corresponding to location L 116 on physical object 102. Defect 114 is located on an internal surface of the part. Defect 118 is marked with an “O” icon and is located on an external surface of physical object 102, in a featureless area of the part at location L′ 120. The “O” icon illustrates that the defect is not a single point but a set of points. Sensor unit 106 may collect sensor data from location L 116 and location L′ 120, and data processing system 108 may match the sensor data to digital object 112. Digital object 112 is a digital model of physical object 102. The digital depiction of the defect is indicated in digital object 112 with an “x” icon 122 and an “O” icon 124. Thus, the physical defect on the part is mapped to the digital model of the part. - Note that digital location L′ 122 is indicated on
digital object 112, which is pictured on display 110. The defect position relative to the digital model of the part is then stored. Thus, the location of the defect, location L 116, has been mapped onto the three-dimensional model of digital object 112. -
Digital object 112, with defect locations [. . .]. Locations [. . .] - An advantageous embodiment of the present invention envisions the compilation of defect maps that aid in overall defect reduction. Throughout the examples herein, a factory environment is used to describe the advantageous embodiments of the present invention. Those of ordinary skill in the art will appreciate that the scope of the present invention is not limited by these examples and that there are many applications for the advantageous embodiments of the present invention.
-
FIG. 2 is a block diagram of components of a defect mapping system in accordance with an advantageous embodiment of the present invention. Physical object 200 is a physical object, such as physical object 102 in FIG. 1, with a point at location L, such as location L 116 in FIG. 1. Physical object 200 may be a part under assembly, and the point may be a defect as in the above example. Point mapping system 202 comprises sensor unit 204, which collects sensor data for location L 206. -
Sensor unit 204 includes an image-collecting unit such as, for example, a digital video camera, a laser triangulating system, or acoustic location technology, for example, sonar. A video camera, as used herein, applies to image sensors of all frequencies of electromagnetic radiation. A set of video cameras means one or more video cameras. Laser triangulation is the process of finding the coordinates of and distance to a point by forming a triangle between that point and two other known reference points, then calculating the lengths of the triangle's sides from its measured angles, using laser technology. A laser, an acronym derived from light amplification by stimulated emission of radiation, is an optical source that emits photons in a coherent beam. Those of ordinary skill in the art, after careful reading of this specification, will appreciate how a laser application may be applied in context herein, and therefore, laser technology will not be discussed further. Sonar is a technique that uses sound propagation to map objects. -
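The triangulation described above can be illustrated with the classic two-station law-of-sines computation. This is a textbook sketch, not the patent's laser system; all names are illustrative.

```python
import math

def triangulate(baseline, angle_a, angle_b):
    """Locate a point from two reference stations a known baseline apart.
    Station A sits at the origin, station B at (baseline, 0); angle_a and
    angle_b are the interior angles (radians) each station measures
    between the baseline and the target.  Returns the target's (x, y)."""
    angle_c = math.pi - angle_a - angle_b            # angle at the target
    # Law of sines: the side opposite angle_b is the A-to-target range.
    range_a = baseline * math.sin(angle_b) / math.sin(angle_c)
    return (range_a * math.cos(angle_a), range_a * math.sin(angle_a))
```

Two stations 2 m apart that each see the target at 45 degrees place it 1 m out, midway along the baseline.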
Sensor unit 204 may be implemented as a compact unit that is handheld, such as sensor unit 106 in FIG. 1. An image sensor may employ, for example, charge coupled device (CCD) or complementary metal oxide semiconductor (CMOS) technology to sense images. However, any image sensor is within the scope of the advantageous embodiments of the present invention. An image sensor may include a microprocessor to process raw image data and communicate with communications unit 214. -
Sensor unit 204 may optionally include an accelerometer to track the distance and direction that sensor unit 204 travels. An accelerometer is a device for measuring acceleration. An accelerometer inherently measures its own motion, in contrast to a device based on remote sensing, such as a global positioning system (GPS). Therefore, an accelerometer may be used for tracking the position of sensor unit 204 and providing data concerning direction and distance of travel in environments unsuitable for remote sensing. The interior portions of a manufactured part are one such example of this type of environment. - Therefore,
sensor data 206 may include a stream of video, or a sampling of digital images, and, in addition, the distance and direction of travel data from the accelerometer. Point mapping system 202 also comprises one or more data processing systems 208. Data processing system 208 may be embodied in one system, or data processing system 208 may be a plurality of interconnected data processing systems. Data processing system 208 comprises correlation module 210, which localizes the digital position data of location L 212, and communicates the digital position data of location L 212 through communications unit 214 to permanent storage for the location L in digital defect position storage 216. Correlation module 210 correlates the physical point with digital object 218. -
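Turning the accelerometer's samples into distance and direction of travel amounts to dead reckoning by double integration. Below is a minimal Euler-integration sketch under that assumption; a real unit would also correct for drift and orientation.

```python
def dead_reckon(position, velocity, accel_samples, dt):
    """Integrate accelerometer samples twice to track the sensor unit's
    position, e.g. inside a fuel tank where remote sensing fails."""
    p, v = list(position), list(velocity)
    for a in accel_samples:
        for i in range(3):
            v[i] += a[i] * dt   # acceleration -> velocity
            p[i] += v[i] * dt   # velocity -> position
    return tuple(p)
```

Starting at rest, two one-second samples of 1 m/s² along X carry the unit 3 m along X under this simple integrator.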
Correlation module 210 includes localization module 220. Correlation module 210 may optionally include simultaneous localization and mapping module 222. Simultaneous localization and mapping module 222 may be optionally included to map point defects, such as defect 118 in FIG. 1. Defect 118 is located in a relatively featureless position on the part, in which there are few place-identifiers. Simultaneous localization and mapping module 222 maps the free space portion of the digital image, using localization module 220 to confirm localization of points. In other words, if a sensor unit, such as sensor unit 204, senses a defect on a featureless portion of the part, simultaneous localization and mapping module 222 may map a track to the location of the defect and thus correlate the defect location to the digital model. While simultaneous localization and mapping module 222 is capable of continuous localization and mapping, the module is optional and may not be needed unless there are no place-identifiers available for location correlation. There are several simultaneous localization and mapping modules available for use as simultaneous localization and mapping module 222, and the advantageous embodiments of the present invention are not limited to a specific module. - In one embodiment,
point mapping system 202 is a compact unit, which may be handheld, containing sensor unit 204, a detached data processing system capable of correlating the position of the point to a digital model, or a display that communicates with the user of point mapping system 202. - In another embodiment,
point mapping system 202 may comprise a compact unit that contains sensor unit 204 and communications unit 214. Communications unit 214 may then communicate data to a unit or units with additional data processing capabilities. Other embodiments of point mapping system 202 may not include compact units and/or may be remotely controlled sensor units. - Turning now to
FIG. 3, a diagram of data processing system 300 is depicted in accordance with an advantageous embodiment of the present invention. Data processing system 300 is an example of a data processing system that may be used, such as data processing system 208 in FIG. 2. In this illustrative example, data processing system 300 includes communications fabric 302, which provides communication between processor unit 304, memory 306, persistent storage 308, communications unit 310, I/O unit 312, and display 314. -
Processor unit 304 serves to execute instructions for software that may be loaded into memory 306, for example, software to implement a correlation module, such as correlation module 210 in FIG. 2. Processor unit 304 may be a set of one or more processors, or may be a multi-processor core, depending on the particular implementation. Further, processor unit 304 may be implemented using one or more heterogeneous processor systems in which a main processor is present with secondary processors on a single chip. Memory 306, in these examples, may be, for example, a random access memory (RAM). Persistent storage 308 may take various forms depending on the particular implementation. For example, persistent storage 308 may be a hard drive, a flash memory, a rewritable optical disk, a rewritable magnetic tape, or some combination of the above, and may provide database storage. -
Communications unit 310, in these examples, provides for communication with other data processing systems or devices. Communications unit 310 may be a network interface card. I/O unit 312 allows for input and output of data with other devices that may be connected to data processing system 300. For example, I/O unit 312 may provide a connection for user input through a keyboard and a mouse. Further, I/O unit 312 may send output to a printer. Display 314 provides a mechanism to display information to a user. - Instructions for the operating system and applications or programs are located on
persistent storage 308. These instructions may be loaded into memory 306 for execution by processor unit 304. The processes of the different embodiments may be performed by processor unit 304 using computer implemented instructions, which may be located in a memory, such as memory 306. -
FIG. 4 is a diagram of an exemplary compact unit for a point mapping system in accordance with an advantageous embodiment of the present invention. Point mapping system 400 is an embodiment of point mapping system 202 in FIG. 2. In this example, point mapping system 400 is a handheld unit. A sensor unit, such as sensor unit 204 in FIG. 2, is enclosed in outer casing 402. On/off switch 404 controls power supply 406. Power supply 406 may be any type of known or available power supply, including, but not limited to, batteries. Power supply 406 provides power for the operation of image sensor 408, accelerometer 410, which is optional, communication unit 412, indicator light 414, and store map location button 416. Camera lens 418 is attached to image sensor 408. Camera lens 418 extends to an opening in outer casing 402. -
Accelerometer 410 may consist of a suspended cantilever beam or proof mass with deflection sensing circuitry. A proof mass is also referred to as a seismic mass. For instance, accelerometer 410 may be a three-axis micro-electromechanical system (MEMS) device. As another example, accelerometer 410 may be a laser accelerometer. Accelerometer 410 may also contain a microprocessor for processing accelerometer data and communicating with communication unit 412. Accelerometer 410 aids in locating the position of the compact unit. Accelerometer 410 produces directional and distance measurement data. -
Communication unit 412 provides for communication with data processing system 420 or other devices, and may contain a microprocessor and antenna for wireless communication. Communication unit 412 is an example of an implementation of communications unit 310 in FIG. 3. Another embodiment envisions the defect mapping system implemented using wired connections. Communication unit 412 communicates image and accelerometer data to data processing system 420 for analysis. Communication unit 412 may also receive information in the form of an alignment alarm from data processing system 420, which causes indicator light 414 to illuminate. Store map location button 416 issues the command to data processing system 420, through communication unit 412, to save the currently indicated location to permanent storage. - Referring now to
FIG. 5, another embodiment of a compact unit, which may be handheld, for a point mapping system is shown in accordance with the advantageous embodiments of the present invention. Point mapping system 500 is similar to point mapping system 400 in FIG. 4 in that it comprises outer casing 502, on/off switch 504, power supply 506, image sensor 508, accelerometer 510, and camera lens 514. However, point mapping system 500 envisions data processing system 516 integrated within outer casing 502 of point mapping system 500. Display 518 is the user interface in these examples and may, for example, be a touch screen system. The user may receive information and instructions from data processing system 516. Point mapping system 500 also has optional laser pointer 520 with laser lens 522 extending from outer casing 502. Laser pointer 520 aids the user in orienting image sensor 508. As the operator is scanning the object, laser pointer 520 illuminates the subject area of the scan with a point of light. Other advantageous embodiments are within the scope of the present invention. -
FIG. 6 depicts a block diagram of a digital object, such as digital object 112 in FIG. 1, in accordance with the advantageous embodiments of the present invention. Digital object 602, such as digital object 218 in FIG. 2, is a representation of the surfaces of a part, a fixture, or other object of interest in a digital format. The surfaces may be external or internal to the part. -
Digital object 602 contains place-identifiers 604. Place-identifiers 604 are landmarks on the part. A landmark is a prominent or conspicuous feature on an object that serves as a guide. In other words, place-identifiers 604 are the visual clues as to a location on the digital model. Place-identifiers 604 correlate to physical object 606. Physical object 606 may be a manufactured part, such as physical object 102 in FIG. 1. Features may include, but are not limited to, edge 608, seam 610, bolt 612, and corner 614, and may also include such features as hinges, fasteners, and markings. Fixtures holding the part may also be included in the digital model and may be used for place-identifiers. The remaining portion of the digital model may be thought of as free space 616 within a three-dimensional grid. Free space 616 on physical object 606 may, for example, correlate to front surface 618, which has few distinguishing place-identifiers. -
FIG. 7 is a block diagram of components of a correlation module in accordance with the advantageous embodiments of the present invention. Correlation module 700, such as correlation module 210 in FIG. 2, comprises localizer module 702 and optional simultaneous localization and mapping module 726. Localizer module 702 comprises image to place-identifier matching module 704, alignment probability calculator 706, comparator 708, and new position estimator 710. New position estimator 710 may include a random pose generator 712, or another method of estimating new positions. Localizer module 702 receives image data 714 from an image sensor, such as image sensor 408 in FIG. 4, and accelerometer data 716 from an accelerometer, such as accelerometer 410 in FIG. 4, as input. In these examples, output data from correlation module 700 is either an unaligned position indication 718 or the digital position of location L 720.
- Image to place-identifier matching module 704 correlates image data 714 to the stored place-identifiers 722 from the digital model to determine a match. A match between image data 714 and place-identifiers 722 locates the image on the digital model. There may be instances in which more than one place-identifier 722 correlates with image data 714. -
Alignment probability calculator 706 calculates the alignment probability that a correlated image to place-identifier pairing is a match, for each of the place-identifiers 722 that are correlated. The highest alignment probability calculated is then compared, in comparator 708, to predefined alignment probability threshold 724. -
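When several place-identifiers correlate with the image, the calculator and comparator step reduces to scoring every candidate, keeping the best, and testing it against the threshold. A sketch with a toy similarity score follows; all names and the scoring function are illustrative assumptions.

```python
def best_alignment(image_descriptor, candidates, threshold):
    """candidates: list of (name, descriptor) pairs for correlated
    place-identifiers.  Returns (name, probability) for the best match,
    or (None, probability) when even the best falls below threshold."""
    def score(descriptor):
        # Toy alignment probability: 1.0 when descriptors agree exactly.
        return 1.0 / (1.0 + abs(image_descriptor - descriptor))
    name, descriptor = max(candidates, key=lambda c: score(c[1]))
    probability = score(descriptor)
    if probability > threshold:
        return name, probability
    return None, probability      # comparator result: unaligned
```

Raising the threshold trades missed alignments for fewer false matches, which is the design lever this comparison exposes.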
Accelerometer data 716 is used by new position estimator 710. New position estimator 710 may take the last known position of the sensor unit, analyze the distance and directional data from the accelerometer, and estimate a new position. Place-identifiers 722 for the new estimated position are correlated to image data 714 in image to place-identifier matching module 704. -
New position estimator 710 may first analyze the global topology of the object and generate a distribution of three-dimensional coordinate triplets for the current pose of the image data. Pose is the orientation of the image with respect to the physical object. A pose defines a three-dimensional position in a three-dimensional computer aided drafting model, as well as an orientation at that position. Therefore, pose data may include X, Y, and Z coordinates, as well as other parameters such as roll, pitch, and yaw. To define roll, pitch, and yaw, three lines may be imagined running through an object and intersecting at right angles at the center of the object: rotation around the front-to-back axis is called roll; rotation around the side-to-side axis is called pitch; and rotation around the vertical axis is called yaw. - Using the generated set of three-dimensional points, the
new position estimator 710 considers the local topology to produce a generalized curve representing an estimate of the image position. Thus, the resultant estimated position is not constrained to lie along a straight line from the last known position, but may lie along a generalized curve. New image data correlates with a higher probability to a particular curve. The less likely curves are then discarded as improbable. - Optionally,
correlation module 700 may include a simultaneous localization and mapping module 726. In such a case, simultaneous localization and mapping module 726 may use localizer module 702 for localization data, and continuously map the object as new image data 714 from the physical object and accelerometer data 716 become available. - The basic components of simultaneous localization and
mapping module 726 include the recognition of new visual landmarks that may be added to the place-identifiers 722, such as place identifiers 604 in FIG. 6, for the digital object. Each of these landmarks is associated with an image pose and is given unique coordinates. The module estimates a measurement from the last place-identifier and assesses the reliability of that measurement. Reliable measurements, as defined by a probability, are then mapped into the digital object, and unreliable measurements are discarded. Probability threshold 724 may be used to compare measurements in simultaneous localization and mapping module 726. -
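The reliability gate just described can be sketched as follows; the landmark representation, the list-based map, and the comparison against the threshold are illustrative assumptions, not details from the embodiment:

```python
def map_landmark(landmark, probability, threshold, digital_map):
    """Map a new landmark into the digital object only when its
    measurement is reliable, i.e. its probability exceeds the
    probability threshold (probability threshold 724 in the text).
    Unreliable measurements are discarded."""
    if probability > threshold:
        digital_map.append(landmark)
        return True
    return False
```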
FIG. 8 is a high-level flowchart of a process for a localization module in accordance with the advantageous embodiments of the present invention. The illustrative process in FIG. 8 is implemented by a hardware/software component for collecting image data, such as sensor unit 204 in FIG. 2, and localizing data, such as localization module 702 in FIG. 7. - In this embodiment, the localization module uses an iterative pose generator that takes multiple inputs. One of the inputs is the current pose (operation 802). The current pose may be a known initial position, or the pose as calculated by the last iteration of the localization module. The current pose is an aligned position. An aligned position is one in which the image data from the sensor unit correlates to the place-identifiers in the digital model at a greater than threshold alignment probability. The sensor unit position may be aligned to the digital model because the sensor unit is initiated in a fixed position receptacle, or an initial operation is performed which aligns the sensor unit to the digital model. In other words, the current image from the sensor unit is correlated to the digital model.
- A second input is data from the accelerometer (operation 804). Accelerometer data may be used to determine the values input to a number generator, such as a Gaussian random number generator, that the pose generator uses. The accelerometer will output a measured X, Y, and Z movement, as well as a measured roll, pitch and yaw movement. These measured movements become the mean values input to the pose generator.
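The pose-generation step just described can be sketched as below; the six-field pose layout, the use of `random.gauss`, and the `sigma` spread are illustrative assumptions rather than details from the embodiment:

```python
import random
from collections import namedtuple

# Six-degree-of-freedom pose: a position in the model plus an
# orientation at that position, as the text defines it.
Pose = namedtuple("Pose", ["x", "y", "z", "roll", "pitch", "yaw"])

def sample_poses(current, measured_delta, n, sigma=0.05):
    """Generate n candidate poses around the accelerometer-measured
    movement. Each measured movement component becomes the mean of a
    Gaussian, as described; sigma is an assumed spread."""
    return [Pose(*(c + random.gauss(d, sigma)
                   for c, d in zip(current, measured_delta)))
            for _ in range(n)]
```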
- In one embodiment, the image data is checked for changes, indicating if the sensor unit has moved. If the image has not changed since the last image, the process remains aligned. In another embodiment, if the accelerometer data indicates that the sensor unit is at rest, the process remains in an aligned state. In yet another embodiment, continuous images are input and analyzed.
- The pose generator takes an input pose and data from the accelerometer to generate a number of random poses, numbered 1, 2, . . . , j, j+1, . . . , n (operation 806). Next, the localization module determines if any place-identifiers are in the digital model for pose j (operation 808).
- The poses generated are communicated to a comparator (operation 810). The comparator also receives image data input (operation 812). The comparator then calculates a probability of alignment between pose j and the image data input (operation 814). The comparator, such as
comparator 708 in FIG. 7, then compares the calculated alignment probability to a predetermined threshold alignment probability (operation 816). A predetermined threshold alignment probability is a user-preset value. Comparator operation 816 ensures system administrator control of the quality of the positional alignment. If the probability of alignment exceeds the threshold, then the process is aligned (operation 818). If the probability that the image data aligns with pose j is lower than the threshold probability, the image data is not aligned with the digital object (operation 820). - Optionally, when the probability percentage exceeds the threshold, the value n may be reduced, thus reducing the computational needs of the system and allowing the system to process incoming data faster. If the last iteration did not produce a value exceeding the threshold, the value n may be increased to look for more potential poses.
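Operations 806 through 820, together with the optional adjustment of n, can be sketched as follows; `alignment_prob` stands in for the image-to-place-identifier comparison, whose actual computation the text leaves to the correlation module, and the bounds on n are assumptions:

```python
def find_aligned_pose(poses, alignment_prob, threshold):
    """Return the first pose whose alignment probability exceeds the
    predetermined threshold (operations 814-818), or None when no
    generated pose aligns (operation 820)."""
    for pose in poses:
        if alignment_prob(pose) > threshold:
            return pose
    return None

def adapt_pose_count(n, aligned, n_min=10, n_max=1000):
    """Optionally shrink the search when aligned, reducing computation,
    and widen it when the last iteration failed to align."""
    return max(n_min, n // 2) if aligned else min(n_max, n * 2)
```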
- If the probability percentage exceeds the threshold (operation 818), then the localization module stores pose j as the current position, along with the probability percentage associated with the pose j alignment (operation 822), with the process terminating thereafter.
- If the probability percentage does not exceed the threshold (operation 820), then the acquire time counter is incremented (operation 824). Acquire time may be set by a system administrator and is used to allow the localizer system several iterations before indicating an alarm to the user.
- Next, the localizer system determines whether the acquire time has been exceeded (operation 826). If the acquire time has been exceeded, the localizer system indicates an alarm to the user (operation 828). The alarm may be an audio or visual indicator activated to alert the user to the unaligned state of the localizer system. If the acquire time has not been exceeded, the localizer system returns to the beginning of the process with no change to the current position.
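Operations 824 through 828 amount to a bounded retry counter; this minimal sketch assumes a simple integer counter and a boolean alarm flag:

```python
def update_acquire_time(count, limit):
    """Increment the acquire-time counter (operation 824) and report
    whether the administrator-set acquire time has been exceeded, in
    which case an alarm should be indicated to the user (operation 828)."""
    count += 1
    return count, count > limit
```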
- Those of ordinary skill in the art will appreciate that there are known methods and algorithms of estimating a new position or pose for the sensor unit and the image data. The advantageous embodiments of the present invention are not limited by a particular implementation of an algorithm performing an estimated position or pose function.
-
FIG. 9 is a high-level flowchart illustrating the process flow for a simultaneous localization and mapping module of the point mapping system in accordance with an advantageous embodiment of the present invention. The different operations shown in FIG. 9 are implemented by a hardware/software component for sensing a position of a point, such as sensor unit 204 in FIG. 2, and localizer module 702 in FIG. 7. - A simultaneous localization and mapping module map comprises voxels representing the three-dimensional space, maintained by a simultaneous localization and mapping algorithm. A voxel is a three-dimensional pixel. The word voxel comes from combining the words volume and pixel. In other words, a voxel is a volume of three-dimensional space. The mapping module works on a matrix of voxels, corresponding to X, Y, and Z coordinates, that represents the intersection of a camera's field-of-view.
- The implementation of the simultaneous localization and mapping process begins with inputting the map from a previous iteration into a map pruning algorithm (operation 902). For efficiency, the map may be pruned so that the implementation only processes data within a certain range of the current position. In this embodiment, the pruning is accomplished by the area of interest detection module, which implements the map pruning algorithm (operation 904). Map pruning may involve copying the area of the map that the image data is likely to modify to a separate, smaller representation of the map that excludes items behind the camera and/or occluded by foreground objects.
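Pruning to a region of interest might look like the following sketch; the dense voxel array and the cubic neighborhood around the current position are assumptions for illustration (the text also mentions excluding occluded regions, which is omitted here):

```python
import numpy as np

def prune_map(voxels, center, radius):
    """Copy the sub-block of the voxel grid within `radius` voxels of
    the current position (operation 904), clamping indices to the grid
    so positions near an edge are handled."""
    lo = [max(0, c - radius) for c in center]
    hi = [min(s, c + radius + 1) for c, s in zip(center, voxels.shape)]
    return voxels[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]].copy()
```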
- The system inputs the current sensor data (operation 906) into a new map generator. The new map generator generates the new map using three-dimensional data (operation 908). In one embodiment, the three-dimensional new map may be created by two cameras that are separated by a known distance. A process that computes a disparity map from the images of the two cameras may be used to create a depth field from which three-dimensional data may be extracted.
- Similarly, in another embodiment, a single camera and accelerometer may be used. The process may simulate the two-camera process by using the accelerometer data to determine the separation distance between two frames of the image data. Yet another embodiment may use laser triangulation to gather three-dimensional information in conjunction with a camera. Thus, a new map using the current sensor data is formed.
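Both the two-camera embodiment and the single-camera-plus-accelerometer variant reduce to the classical pinhole-stereo relation Z = f·B/d, sketched here; the function name and units are illustrative assumptions:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth of a point from its disparity between two views separated
    by a known baseline: Z = f * B / d. In the single-camera variant,
    the accelerometer supplies the baseline between two frames."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```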
- The new map and the pruned previous map are input into a map combiner module (operation 910). The process then determines if any place-identifiers are within the predicted environment of the sensor unit. In other words, the system will determine if any place-identifiers have a high probability of being represented on the new map (operation 912). For instance, if the accelerometer data indicates that the estimated position of the sensor unit is three meters west of the last aligned position and around an acute angle from the last place-identifier, the process will then model the predicted environment of the location of the sensor unit and determine if there are any place-identifiers within image sensor range of the current estimated position of the sensor unit.
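The test in operation 912 for place-identifiers within the predicted environment can be sketched as a proximity query; the dictionary layout and the Euclidean-distance test are assumptions for illustration:

```python
import math

def identifiers_in_range(place_identifiers, estimated_position, sensor_range):
    """Return the place-identifiers within image-sensor range of the
    estimated sensor position (operation 912).

    place_identifiers maps an identifier name to its (x, y, z)
    coordinates in the digital model."""
    return [name for name, pos in place_identifiers.items()
            if math.dist(pos, estimated_position) <= sensor_range]
```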
- A weight is then determined for the pruned previous map (operation 914), and a weight is determined for the new map (operation 916). The voxels that represent the free space between voxels that represent the location of place-identifiers will have their value increased. The voxels outside the field-of-view and voxels that are behind objects in the field-of-view are not modified. The maps that have a larger number of high-value voxels will be weighted higher.
- The weights are parameters that represent two percentages, which sum to 100. A higher new map weight means that the new map includes more localized place-identifiers than the pruned previous map. Further, a higher pruned previous map weight implies that place-identifiers seen in the image data are less likely to alter the updated map.
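A minimal sketch of the map combiner, with the weights expressed as percentages summing to 100 as described; the per-voxel linear blend is an assumption, since the text does not give the combination rule:

```python
def combine_maps(pruned_prev, new_map, w_prev, w_new):
    """Blend two voxel-value sequences using percentage weights
    (operation 910). A higher new-map weight lets place-identifiers
    seen in the image data alter the updated map more strongly."""
    if w_prev + w_new != 100:
        raise ValueError("weights must sum to 100")
    return [(w_prev * p + w_new * n) / 100.0
            for p, n in zip(pruned_prev, new_map)]
```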
- The result is an updated map (operation 918) that may be used by a localization module, such as
localizer module 702 in FIG. 7, to confirm alignment (operation 920). Thus, the process ends. - Turning now to
FIG. 10, a flowchart of a save map position process for the point mapping process is shown in accordance with the advantageous embodiments of the present invention. The different operations shown in FIG. 10 are implemented by a hardware/software component for sensing a position of a point, such as sensor unit 204 in FIG. 2, and correlation module 700 in FIG. 7. The process begins by receiving a command to save a map position (operation 1002). The process determines if the map position is aligned (operation 1004). If the map position is not aligned, an alarm is indicated and the process terminates (operation 1006). If the map position is aligned, the map position is stored (operation 1008), and thus the process terminates. - The flowcharts and block diagrams in the different depicted embodiments illustrate the architecture, functionality, and operation of some possible implementations of apparatus, methods, and computer program products. In this regard, each block in the flowchart or block diagram may represent a module, segment, or portion of code which comprises one or more executable instructions for implementing the specified function or functions. In some alternative implementations, the function or functions noted in the block diagram may occur out of the order noted in the figures. For example, in some cases, two blocks shown in succession may be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
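The FIG. 10 save flow described above reduces to a guard on alignment; `store` and `alarm` are placeholder callbacks, not names from the embodiment:

```python
def save_map_position(position, aligned, store, alarm):
    """Store the map position only when it is aligned (operations
    1002-1008); otherwise indicate an alarm (operation 1006)."""
    if not aligned:
        alarm()
        return False
    store(position)
    return True
```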
- Thus, the advantageous embodiments of the present invention provide a computer implemented method, apparatus, and computer usable program product to map a physical point to a digital model. The different embodiments can take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment containing both hardware and software elements. In one embodiment, the features are implemented in software, which includes, but is not limited to, firmware, resident software, and microcode. For example, the features can be implemented in signal processing software modules on general purpose computers, digital signal processing (DSP) chips, field programmable gate array (FPGA) integrated circuits, and application specific integrated circuit (ASIC) chips utilizing firmware programming. For example, the smart averaging filter can be implemented entirely as software, entirely as hardware, or as a combination of hardware and software.
- The description of the present invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. Further, different advantageous embodiments may provide different advantages as compared to other advantageous embodiments. The embodiment or embodiments selected are chosen and described in order to best explain the principles of the invention, the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
Claims (24)
1-3. (canceled)
4. A method for mapping a set of physical points to a digital model, the method comprising:
receiving sensor data for the set of physical points from a sensor unit to form received sensor data;
aligning the received sensor data from the sensor unit to a set of place-identifiers in the digital model, wherein a position of the set of physical points is correlated to the digital model to form a correlated position; and
storing the correlated position in a storage.
5-6. (canceled)
7. The method of claim 4 further comprising:
providing a simultaneous localization and mapping module for tracking a location of the set of physical points.
8. The method of claim 4 , wherein the received sensor data comprises data from one or more of a laser triangulation system, an acoustical localizer, an accelerometer, or a set of video cameras of the sensor unit.
9. (canceled)
10. The method of claim 4 , further comprising:
orienting the sensor unit using a laser pointer on the sensor unit.
11. The method of claim 8 , further comprising:
displaying the digital model on an interactive display of the sensor unit.
12. The method of claim 4 , wherein the sensor unit is one of remotely controlled or autonomously controlled.
13-25. (canceled)
26. The method of claim 4 , wherein the physical points are physical points of a physical object, and wherein the digital model comprises a multi-dimensional description of the physical object.
27. The method of claim 26 , wherein the multi-dimensional description of the physical object is a three-dimensional description of the physical object.
28. A computer program product, comprising:
a computer usable storage medium storing computer usable storage code for mapping a set of physical points to a digital model, the computer program product including:
computer usable program code for receiving sensor data for the set of physical points from a sensor unit to form received sensor data;
computer usable program code for aligning the received sensor data from the sensor unit to a set of place-identifiers in the digital model, wherein a position of the set of physical points is correlated to the digital model to form a correlated position; and
computer usable program code for storing the correlated position in a storage.
29. The computer program product of claim 28 , wherein the received sensor data comprises data from one or more of a laser triangulation system, an acoustical localizer, an accelerometer, or a set of video cameras of the sensor unit.
30. The computer program product of claim 29 , further comprising:
computer usable program code for displaying the digital model on an interactive display of the sensor unit.
31. The computer program product of claim 28 , wherein the physical points are physical points of a physical object, and wherein the digital model comprises a multi-dimensional description of the physical object.
32. The computer program product of claim 31 , wherein the multi-dimensional description of the physical object is a three-dimensional description of the physical object.
33. An apparatus, comprising:
a sensor unit;
a digital model;
a bus system;
a communications system connected to the bus system;
a memory connected to the bus system, wherein the memory includes a set of instructions; and
a processing unit connected to the bus system, wherein the processing unit executes the set of instructions to:
receive sensor data for a set of physical points from the sensor unit to form received sensor data;
align the received sensor data from the sensor unit to a set of place-identifiers in the digital model, wherein a position of the set of physical points is correlated to the digital model to form a correlated position; and
store the correlated position in a storage.
34. The apparatus of claim 33 , further comprising:
a simultaneous localization and mapping module for tracking a location of the set of physical points.
35. The apparatus of claim 33 , wherein the received sensor data comprises data from one or more of a laser triangulation system, an acoustical localizer, an accelerometer, or a set of video cameras of the sensor unit.
36. The apparatus of claim 33 , further comprising:
a laser pointer on the sensor unit for orienting the sensor unit.
37. The apparatus of claim 35 , further comprising:
an interactive display of the sensor unit for displaying the digital model.
38. The apparatus of claim 33 , wherein the physical points are physical points of a physical object, and wherein the digital model comprises a multi-dimensional description of the physical object.
39. The apparatus of claim 38 , wherein the multi-dimensional description of the physical object is a three-dimensional description of the physical object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/603,276 US20100042382A1 (en) | 2007-01-23 | 2009-10-21 | Method and Apparatus for Localizing and Mapping the Position of a Set of Points on a Digital Model |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/626,168 US7627447B2 (en) | 2007-01-23 | 2007-01-23 | Method and apparatus for localizing and mapping the position of a set of points on a digital model |
US12/603,276 US20100042382A1 (en) | 2007-01-23 | 2009-10-21 | Method and Apparatus for Localizing and Mapping the Position of a Set of Points on a Digital Model |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/626,168 Continuation US7627447B2 (en) | 2007-01-23 | 2007-01-23 | Method and apparatus for localizing and mapping the position of a set of points on a digital model |
Publications (1)
Publication Number | Publication Date |
---|---|
US20100042382A1 true US20100042382A1 (en) | 2010-02-18 |
Family
ID=39144577
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/626,168 Active US7627447B2 (en) | 2007-01-23 | 2007-01-23 | Method and apparatus for localizing and mapping the position of a set of points on a digital model |
US12/603,276 Abandoned US20100042382A1 (en) | 2007-01-23 | 2009-10-21 | Method and Apparatus for Localizing and Mapping the Position of a Set of Points on a Digital Model |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/626,168 Active US7627447B2 (en) | 2007-01-23 | 2007-01-23 | Method and apparatus for localizing and mapping the position of a set of points on a digital model |
Country Status (3)
Country | Link |
---|---|
US (2) | US7627447B2 (en) |
GB (1) | GB2459803B (en) |
WO (1) | WO2008091714A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140297485A1 (en) * | 2013-03-29 | 2014-10-02 | Lexmark International, Inc. | Initial Calibration of Asset To-Be-Tracked |
US20150350378A1 (en) * | 2014-05-28 | 2015-12-03 | Alexander Hertel | Platform for Constructing and Consuming Realm and Object Feature Clouds |
US9511496B2 (en) | 2014-06-20 | 2016-12-06 | The Boeing Company | Robot alignment systems and methods of aligning a robot |
US9862096B2 (en) | 2015-03-30 | 2018-01-09 | The Boeing Company | Automated dynamic manufacturing systems and related methods |
US10102629B1 (en) * | 2015-09-10 | 2018-10-16 | X Development Llc | Defining and/or applying a planar model for object detection and/or pose estimation |
US10209062B1 (en) * | 2014-08-01 | 2019-02-19 | Google Llc | Use of offline algorithm to determine location from previous sensor data when location is requested |
Families Citing this family (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7627447B2 (en) * | 2007-01-23 | 2009-12-01 | The Boeing Company | Method and apparatus for localizing and mapping the position of a set of points on a digital model |
US8060835B2 (en) * | 2007-06-05 | 2011-11-15 | The Boeing Company | Three dimensional defect mapping |
US7859655B2 (en) * | 2007-09-28 | 2010-12-28 | The Boeing Company | Method involving a pointing instrument and a target object |
US9541505B2 (en) | 2009-02-17 | 2017-01-10 | The Boeing Company | Automated postflight troubleshooting sensor array |
US9418496B2 (en) * | 2009-02-17 | 2016-08-16 | The Boeing Company | Automated postflight troubleshooting |
US8812154B2 (en) | 2009-03-16 | 2014-08-19 | The Boeing Company | Autonomous inspection and maintenance |
US8977528B2 (en) * | 2009-04-27 | 2015-03-10 | The Boeing Company | Bonded rework simulation tool |
US8503720B2 (en) | 2009-05-01 | 2013-08-06 | Microsoft Corporation | Human body pose estimation |
US9108738B1 (en) | 2009-05-19 | 2015-08-18 | The Boeing Company | Apparatus for refueling aircraft |
US9046892B2 (en) | 2009-06-05 | 2015-06-02 | The Boeing Company | Supervision and control of heterogeneous autonomous operations |
US8568545B2 (en) * | 2009-06-16 | 2013-10-29 | The Boeing Company | Automated material removal in composite structures |
GB201002973D0 (en) * | 2010-02-23 | 2010-04-07 | Airbus Operations Ltd | Recording the location of a point of interest on an object |
US8773289B2 (en) | 2010-03-24 | 2014-07-08 | The Boeing Company | Runway condition monitoring |
US8599044B2 (en) | 2010-08-11 | 2013-12-03 | The Boeing Company | System and method to assess and report a health of a tire |
US8712634B2 (en) | 2010-08-11 | 2014-04-29 | The Boeing Company | System and method to assess and report the health of landing gear related components |
US8982207B2 (en) | 2010-10-04 | 2015-03-17 | The Boeing Company | Automated visual inspection system |
US8711206B2 (en) * | 2011-01-31 | 2014-04-29 | Microsoft Corporation | Mobile camera localization using depth maps |
US8942917B2 (en) | 2011-02-14 | 2015-01-27 | Microsoft Corporation | Change invariant scene recognition by an agent |
CN102096739B (en) * | 2011-02-15 | 2012-10-10 | 中国航空工业集团公司西安飞机设计研究所 | Aircraft fuel amount measurement sensor layout optimization design method |
US9953039B2 (en) * | 2011-07-19 | 2018-04-24 | Disney Enterprises, Inc. | Method and system for providing a compact graphical user interface for flexible filtering of data |
CN102254041B (en) * | 2011-08-15 | 2012-10-10 | 中国航空工业集团公司西安飞机设计研究所 | Standard design working condition determining method used for building quality characteristic database of spirit of boomer and carrier aircrafts |
WO2013050338A1 (en) | 2011-10-03 | 2013-04-11 | Asml Netherlands B.V. | Method to provide a patterned orientation template for a self-assemblable polymer |
US9117185B2 (en) | 2012-09-19 | 2015-08-25 | The Boeing Company | Forestry management system |
US9857470B2 (en) | 2012-12-28 | 2018-01-02 | Microsoft Technology Licensing, Llc | Using photometric stereo for 3D environment modeling |
US9251590B2 (en) | 2013-01-24 | 2016-02-02 | Microsoft Technology Licensing, Llc | Camera pose estimation for 3D reconstruction |
US9940553B2 (en) | 2013-02-22 | 2018-04-10 | Microsoft Technology Licensing, Llc | Camera/object pose from predicted coordinates |
JP6444027B2 (en) * | 2013-12-09 | 2018-12-26 | キヤノン株式会社 | Information processing apparatus, information processing apparatus control method, information processing system, and program |
FR3021442B1 (en) * | 2014-05-21 | 2018-01-12 | Airbus Group Sas | METHOD OF PROCESSING LOCAL INFORMATION |
MX2018008373A (en) * | 2016-01-07 | 2018-11-09 | Walmart Apollo Llc | Systems and methods of mapping storage facilities. |
US10048753B1 (en) * | 2017-04-20 | 2018-08-14 | Robert C. Brooks | Perspective or gaze based visual identification and location system |
US11194019B2 (en) | 2018-04-30 | 2021-12-07 | Faro Technologies, Inc. | System and method of one touch registration of three-dimensional scans with an augmented reality enabled mobile computing device |
US20210176319A1 (en) * | 2019-12-06 | 2021-06-10 | Zurn Industries, Llc | Water management system and user interface |
US11108865B1 (en) | 2020-07-27 | 2021-08-31 | Zurn Industries, Llc | Battery powered end point device for IoT applications |
US11913345B2 (en) | 2021-07-26 | 2024-02-27 | General Electric Company | System and method of using a tool assembly |
US11555734B1 (en) | 2022-02-18 | 2023-01-17 | Zurn Industries, Llc | Smart and cloud connected detection mechanism and real-time internet of things (IoT) system management |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4685054A (en) * | 1983-06-30 | 1987-08-04 | Valtion Teknillinen Tutkimuskeskus | Method and apparatus for outlining the environment of a multiarticular duty machine by means of a laser pointer |
US5222155A (en) * | 1991-03-26 | 1993-06-22 | Massachusetts Institute Of Technology | Computer apparatus and method for fuzzy template shape matching using a scoring function |
US20030043964A1 (en) * | 2001-08-31 | 2003-03-06 | Jetray Corporation | Inspection system and method |
US20030139836A1 (en) * | 2002-01-24 | 2003-07-24 | Ford Global Technologies, Inc. | Paint defect automated seek and repair assembly and method |
US20080141072A1 (en) * | 2006-09-21 | 2008-06-12 | Impact Technologies, Llc | Systems and methods for predicting failure of electronic systems and assessing level of degradation and remaining useful life |
US7627447B2 (en) * | 2007-01-23 | 2009-12-01 | The Boeing Company | Method and apparatus for localizing and mapping the position of a set of points on a digital model |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7272254B2 (en) | 2003-07-09 | 2007-09-18 | General Electric Company | System and method for analyzing and identifying flaws in a manufactured part |
-
2007
- 2007-01-23 US US11/626,168 patent/US7627447B2/en active Active
-
2008
- 2008-01-02 WO PCT/US2008/050035 patent/WO2008091714A1/en active Application Filing
- 2008-01-02 GB GB0914689A patent/GB2459803B/en active Active
-
2009
- 2009-10-21 US US12/603,276 patent/US20100042382A1/en not_active Abandoned
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4685054A (en) * | 1983-06-30 | 1987-08-04 | Valtion Teknillinen Tutkimuskeskus | Method and apparatus for outlining the environment of a multiarticular duty machine by means of a laser pointer |
US5222155A (en) * | 1991-03-26 | 1993-06-22 | Massachusetts Institute Of Technology | Computer apparatus and method for fuzzy template shape matching using a scoring function |
US20030043964A1 (en) * | 2001-08-31 | 2003-03-06 | Jetray Corporation | Inspection system and method |
US6636581B2 (en) * | 2001-08-31 | 2003-10-21 | Michael R. Sorenson | Inspection system and method |
US20030139836A1 (en) * | 2002-01-24 | 2003-07-24 | Ford Global Technologies, Inc. | Paint defect automated seek and repair assembly and method |
US6714831B2 (en) * | 2002-01-24 | 2004-03-30 | Ford Motor Company | Paint defect automated seek and repair assembly and method |
US20080141072A1 (en) * | 2006-09-21 | 2008-06-12 | Impact Technologies, Llc | Systems and methods for predicting failure of electronic systems and assessing level of degradation and remaining useful life |
US7627447B2 (en) * | 2007-01-23 | 2009-12-01 | The Boeing Company | Method and apparatus for localizing and mapping the position of a set of points on a digital model |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140297485A1 (en) * | 2013-03-29 | 2014-10-02 | Lexmark International, Inc. | Initial Calibration of Asset To-Be-Tracked |
US10681183B2 (en) * | 2014-05-28 | 2020-06-09 | Alexander Hertel | Platform for constructing and consuming realm and object featured clouds |
US20150350378A1 (en) * | 2014-05-28 | 2015-12-03 | Alexander Hertel | Platform for Constructing and Consuming Realm and Object Feature Clouds |
US9723109B2 (en) * | 2014-05-28 | 2017-08-01 | Alexander Hertel | Platform for constructing and consuming realm and object feature clouds |
US20170324843A1 (en) * | 2014-05-28 | 2017-11-09 | Alexander Hertel | Platform for Constructing and Consuming Realm and Object Featured Clouds |
US11729245B2 (en) | 2014-05-28 | 2023-08-15 | Alexander Hertel | Platform for constructing and consuming realm and object feature clouds |
US11368557B2 (en) | 2014-05-28 | 2022-06-21 | Alexander Hertel | Platform for constructing and consuming realm and object feature clouds |
US9511496B2 (en) | 2014-06-20 | 2016-12-06 | The Boeing Company | Robot alignment systems and methods of aligning a robot |
US10976161B2 (en) | 2014-08-01 | 2021-04-13 | Google Llc | Use of offline algorithm to determine location from previous sensor data when location is requested |
US10209062B1 (en) * | 2014-08-01 | 2019-02-19 | Google Llc | Use of offline algorithm to determine location from previous sensor data when location is requested |
US11525678B2 (en) | 2014-08-01 | 2022-12-13 | Google Llc | Use of offline algorithm to determine location from previous sensor data when location is requested |
US9862096B2 (en) | 2015-03-30 | 2018-01-09 | The Boeing Company | Automated dynamic manufacturing systems and related methods |
US10102629B1 (en) * | 2015-09-10 | 2018-10-16 | X Development Llc | Defining and/or applying a planar model for object detection and/or pose estimation |
Also Published As
Publication number | Publication date |
---|---|
WO2008091714A1 (en) | 2008-07-31 |
US7627447B2 (en) | 2009-12-01 |
GB2459803A (en) | 2009-11-11 |
US20080177411A1 (en) | 2008-07-24 |
GB0914689D0 (en) | 2009-09-30 |
GB2459803B (en) | 2011-10-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7627447B2 (en) | Method and apparatus for localizing and mapping the position of a set of points on a digital model | |
US10096129B2 (en) | Three-dimensional mapping of an environment | |
US11668571B2 (en) | Simultaneous localization and mapping (SLAM) using dual event cameras | |
Liang et al. | Image based localization in indoor environments | |
US8792726B2 (en) | Geometric feature extracting device, geometric feature extracting method, storage medium, three-dimensional measurement apparatus, and object recognition apparatus | |
Kümmerle et al. | Large scale graph-based SLAM using aerial images as prior information | |
Baltzakis et al. | Fusion of laser and visual data for robot motion planning and collision avoidance | |
JP5759161B2 (en) | Object recognition device, object recognition method, learning device, learning method, program, and information processing system | |
CN112785702A (en) | SLAM method based on tight coupling of 2D laser radar and binocular camera | |
JP2018522345A5 (en) | ||
US20210049784A1 (en) | Localization of a surveying instrument | |
JP2021530821A (en) | Methods, equipment and computer programs for performing 3D wireless model construction | |
Huang et al. | A novel multi-planar LIDAR and computer vision calibration procedure using 2D patterns for automated navigation | |
KR102075844B1 (en) | Localization system merging results of multi-modal sensor based positioning and method thereof | |
KR20090088516A (en) | Method for self-localization of a robot based on object recognition and environment information around the recognized object | |
RU2572637C2 (en) | Parallel or serial reconstructions in online and offline modes for 3d measurements of rooms | |
US9727978B2 (en) | Method for extracting outer space feature information from spatial geometric data | |
Li et al. | 3D triangulation based extrinsic calibration between a stereo vision system and a LIDAR | |
Zhang | LILO: A Novel Lidar–IMU SLAM System With Loop Optimization | |
EP3088983B1 (en) | Moving object controller and program | |
Cheng et al. | AR-based positioning for mobile devices | |
JP7160257B2 (en) | Information processing device, information processing method, and program | |
Liu et al. | Outdoor camera calibration method for a GPS & camera based surveillance system | |
Spampinato et al. | An embedded stereo vision module for 6D pose estimation and mapping | |
Spampinato et al. | An embedded stereo vision module for industrial vehicles automation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |