WO2011129880A1 - Method for augmenting reality by controlling equipment with a mobile device - Google Patents

Method for augmenting reality by controlling equipment with a mobile device Download PDF

Info

Publication number
WO2011129880A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile device
cad
callout
application
block
Prior art date
Application number
PCT/US2011/000662
Other languages
French (fr)
Inventor
Frank S. Ruotolo
Filip T. Peters
Original Assignee
Titansan Engineering, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Titansan Engineering, Inc. filed Critical Titansan Engineering, Inc.
Publication of WO2011129880A1 publication Critical patent/WO2011129880A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Definitions

  • the present invention relates in general to merging or overlaying virtual imagery with real world imagery, and more particularly to merging real world images and video that are displayed on a mobile device with laser or video projections from a projection device that is controlled by the mobile device.
  • FIG. 1A illustrates one embodiment of the "Native CAD View Application.”
  • Fig. 1B illustrates one embodiment of the "Import CAD Process of the Native CAD View Application.”
  • Fig. 1C illustrates one embodiment of the "View Controls Process of the Native CAD View Application.”
  • FIG. 2A illustrates one embodiment of the "Native CAD Markup Application.”
  • FIG. 2B illustrates one embodiment of the "User Markup Process of the Native CAD Markup Application.”
  • FIG. 2C illustrates one detailed embodiment of the "View Controls Process of the Native CAD Markup Application.”
  • FIG. 3A illustrates one embodiment of the "Native CAD Project Application.”
  • Fig. 3B illustrates one embodiment of the "Selection Process of the Native CAD Project Application.”
  • Fig. 3C illustrates one embodiment of the "Send Control-Path Process of the Native CAD Project Application.”
  • FIG. 4A illustrates one embodiment of the "Native CAD Inspect Application.”
  • FIG. 4B illustrates one embodiment of the "Data Process of the CAD Inspect Application.”
  • Fig. 4C illustrates one embodiment of the "Report Process of the CAD Inspect Application.”
  • Fig. 5A illustrates a mobile device depicting a graphical representation of a component assembly process carried out in accordance with the principles of the invention.
  • Fig. 5B illustrates a mobile device depicting a graphical representation of how a lay-up process may be carried out using the principles of the invention.
  • Fig. 5C illustrates a mobile device depicting a topographical projection in the real world, as controlled by a mobile device in accordance with the principles of the invention.
  • Fig. 5D illustrates a mobile device depicting a graphical representation of an exemplary manufacturing process for a wired component, carried out in accordance with the principles of the invention.
  • the method includes receiving, by the mobile device, a native computer aided design (CAD) file modeling a physical component, and then executing the native CAD file so as to display the modeling of the physical component on a display screen of the mobile device.
  • the method further includes receiving a user input to overlay one or more virtual laser projections onto the displayed modeling of the physical component, and then transmitting instructions to a connected laser projector, wherein the instructions are to cause the laser projector to project one or more physical laser projections, corresponding to the one or more virtual laser projections, onto the physical component.
  • One aspect of the invention is to effectively enhance real world imagery by merging, on a mobile device, images and video from the real world with laser or video projections made in the real world by a projection device controlled by the mobile device.
  • reality is augmented in capability and functionality through the integration of a mobile device, native CAD (computer-aided design) data, and real world images and video (see illustrative embodiments in Figs. 1A - 1C).
  • One or more aspects of the invention are carried out by the novel achievement of importing 'native' CAD files onto a mobile device.
  • the importer is present on the mobile device itself, not on an external computer.
  • prior art systems required either 1) a conversion of the CAD data to another form, such as a triangulated mesh rendition of the CAD data, or 2) relied on an external computer system (e.g., over a network connection) to transmit image information for display on the subject mobile device.
  • Another related aspect of the invention is that, by virtue of having all of the native CAD data local to the mobile device, it is further possible to have the mobile device itself perform all in-process calculations, such as projecting a point (XYZ position) onto a surface (CAD data).
  • the above aspects of the invention allow the matching of the virtual and physical worlds.
  • laser projectors and other measurement-capable devices may be used to collect known data points in the physical world in order to match the CAD-rendered virtual world on the mobile device to the physical world.
  • the principles of the invention allow for the augmentation of reality in both the outbound sense and the inbound sense. In the outbound sense, control of a laser projection by the mobile device is realized in the real world by the laser projection onto the manufactured part or assembly of interest.
  • the laser projector hardware may be used to guide the operator where to: 1) place a component as part of an assembly process, or 2) place a composite-ply as part of a manufacturing process, or 3) other tasks such as paint-markup, etc.
  • the resulting laser projection may be displayed on the mobile device overlaid onto real world imagery (camera image) of the component of interest.
  • the terms “a” or “an” shall mean one or more than one.
  • the term “plurality” shall mean two or more than two.
  • the term “another” is defined as a second or more.
  • the terms “including” and/or “having” are open ended (e.g., comprising).
  • the term “or” as used herein is to be interpreted as inclusive or meaning any one or any combination. Therefore, “A, B or C” means any of the following: A; B; C; A and B; A and C; B and C; A, B and C. An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive.
  • Fig. 1A illustrates one embodiment of the "Native CAD View Application" as designed for a mobile device.
  • the mobile device may be any mobile phone (e.g., like an iPhone™), media-pad (e.g., an iPad™, iPod-Touch™) or other similar handheld device.
  • Fig. 1B illustrates one embodiment of the "Import CAD Process of the Native CAD View Application.”
  • the Import CAD process is improved by maintaining the original native CAD data, a proprietary mathematical representation of the same data, and a proprietary viewable representation of the same data.
  • the proprietary formats may be created as soon as the model is first available. Intelligent decisions regarding which model or piece of model to load improve the process.
  • FIG. 1C illustrates one embodiment of the "View Controls Process of the Native CAD View Application.”
  • the View Controls Process works both in the virtual world and in an augmented reality combining the native CAD data and a real world image or video.
  • reality is augmented in capability and functionality through the integration of a mobile device, native CAD data, user-defined markup - draw and tags, and images and video of the real world (see illustrative embodiments in Figs. 2A - 2C).
  • FIG. 2A illustrates one embodiment of the "Native CAD Markup Application.”
  • the Native CAD Markup Application expands on the Native CAD View Application by adding markup controls - draw and tag functionality.
  • Fig. 2B illustrates one embodiment of the "User Markup Process of the Native CAD Markup Application.”
  • the User Markup process works both in the virtual world and in an augmented reality combining the native CAD data and a real world image or video.
  • Fig. 2C illustrates one embodiment of the "View Controls Process of the Native CAD Markup Application.”
  • the View Controls Process works both in the virtual world and in an augmented reality combining the native CAD data and a real world image or video.
  • reality is augmented in capability and functionality through the integration of a mobile device and a piece of equipment (see illustrative embodiments in Figs. 3A - 3C) that returns new information back into the real world.
  • the equipment can be any data input device whether used for measurement, testing, sensing (vibration analysis, size, distance, etc), inspection, quality control, assembly, reverse engineering, automation, robot control, surgical and medical equipment control, communication and reporting, device and peripherals control, marking, etching, projection or other.
  • Fig. 3A illustrates one embodiment of the "Native CAD Project Application.”
  • the Native CAD Project Application expands on the Native CAD Markup Application by adding an interface and controls to a projection device.
  • a projection device may be a laser or video projector and is used to overlay an image from the mobile device onto real world imagery (pictures, video, etc.).
  • Fig. 3B illustrates one embodiment of the "Selection Process of the Native CAD Project Application.”
  • the selection process allows the user to choose surfaces and curves and define appropriate controls for projection.
  • Example controls are based on view, deviation within tolerance from the perfect form, and steps along the projection within the speed of the projector.
  • Fig. 3C illustrates one embodiment of the "Send Control-Path Process of the Native CAD Project Application.”
  • the Send Control-Path Process interfaces and controls a projection device.
  • the Native CAD View Application is a mobile device application for the import and viewing of native CAD files for the Design, Engineering, Manufacturing, Assembly, Quality Control, Reverse Engineering, Projection, Composite Ply Layout, Marking, Etching, and Painting markets.
  • the Native CAD View Application can be used for design & engineering, part & feature comparison, first article inspection, part inspection, production inspection, tool building, assembly, composite ply layout, marking, etching, and painting.
  • native CAD data is imported (block 102).
  • the import is improved by maintaining the original, native CAD data, a proprietary mathematical representation of the same data, and a proprietary viewable representation of the same data.
  • the proprietary formats are created as soon as the model is first available.
  • the Import CAD process begins with a check if this is the first data read (block 721). If this is not the first read, then the view-mesh may be read (block 728). If this is the first read, then the import process starts the read (block 722), determines the file type (block 723), and checks if this file type is supported (block 724). If this file type is not supported then the read may stop (block 725).
  • a proprietary mathematical representation of the CAD data may be created and saved (block 726) and a proprietary viewable representation of the CAD data created and saved (block 727). Intelligent decisions regarding which model or piece of model to load improve the process.
  • the View Controls Process works both in the virtual world and in an augmented reality combining the native CAD data and a real world image or video.
  • a user interface loop with View Controls (block 703).
  • the user can opt to zoom (block 742), pan (block 743), or rotate (block 744) the image.
  • the image may be the native CAD data, a viewable representation of the data, a real world image, real world video, or any combination of the above.
  • a resultant image may be generated and emailed (block 761) and/or saved (block 762).
  • the Native CAD Markup Application expands on the Native CAD View Application by adding markup controls - draw and tag functionality.
  • the Native CAD Markup Application can be used for design & engineering, first article inspection, part inspection, production inspection, tool building, assembly, composite ply layout, marking, etching, and painting.
  • native CAD data is imported (block 102).
  • the user can draw and erase on the data (block 752).
  • the user can tag features of the data (block 753), assign a name to the tag (block 754) and attach text, image, video, or other information (block 755).
  • the view controls (block 703) work both in the virtual world and in an augmented reality combining the native CAD data and a real world image or video.
  • a resultant image is generated and either emailed (block 761) or saved (block 762).
  • the Native CAD Project Application expands on the Native CAD Markup Application by adding an interface and controls to a projection device.
  • the projection device may be a laser or video projector and is used to overlay an image from the mobile device into the real world.
  • the Native CAD Project Application can be used for design & engineering, first article inspection, part inspection, production inspection, tool building, assembly, composite ply layout, marking, etching, and painting.
  • a CAD file is imported (block 102).
  • the user can choose solids, surfaces, faces, features, forms, and curves and define appropriate controls for projection.
  • Example controls are based on view, deviation within tolerance from the perfect form, and steps along the projection within the speed of the projector.
  • a control-path is sent to a projection device (block 763) and feedback is received (block 764). If there are multiple devices available, the user can select which device (block 781), with which parameters (block 782), and which control-path (block 783) before sending (block 784).
  • the Native CAD Inspect Application is a mobile device application for the Design, Engineering, Manufacturing, Assembly, Painting, Quality Control, Reverse Engineering markets.
  • the CAD Inspect Application can be used for design & engineering, first article inspection, part inspection, production inspection, tool building, assembly, composite ply layout, marking, etching, and painting.
  • the Native CAD Inspect Application requires CAD (computer-aided design) data of the part (or other object) to be inspected.
  • the CAD data is the nominal information to which the measured data, or actual information, is compared. This actual-to-nominal comparison using CAD data is the core of the CAD Inspect Application.
  • a CAD file is imported (block 102), a plan for inspection may be created (block 104) and/or loaded (block 103). In a loop, commands are sent and data is received (block 106) and processed (block 120). Lastly, a report may be generated (block 140) and emailed (block 161) and/or saved (block 162).
  • in the process-data flow, the data may be fit geometrically (block 121) using, for example, least-squares fitting algorithms, the results compared to the CAD or nominal data (block 125), the results checked against tolerances (block 130), and the result either accepted or rejected (block 135). If rejected, the process data may repeat.
  • the flow of creating a report is to select a type of report (block 142), then to create the appropriate report, either HTML (block 145) or tabular (block 150), and finally to view the report (block 155).
  • the mobile device 500 is shown as displaying a graphical representation of a CAD modeled component assembly 510.
  • a user interacts with the Native CAD View application executing on the mobile device to overlay desired virtual laser projection lines 520 onto the CAD model 510.
  • a physical laser projector would then be controlled by the mobile device to generate real versions of projection lines 520.
  • the real world orientation of the physical laser projector can be seen from the fact that the projection emanates diagonally from the top-right of the modeled component assembly 510.
  • the invention preserves and displays the position and orientation of the laser as referenced in the real world relative to the modeled component assembly 510.
  • Fig. 5B depicts the mobile device 500 executing the Native CAD View application in which a composite ply lay-up or layering of sheets of composite materials is shown. The result is a build-up of a 3-dimensional thickness.
  • the laser projector is used to guide the operator.
  • the mobile device is used to simplify the process through simulation (before) and process control (during) and verification (after).
  • Fig. 5C depicts the mobile device 500 executing the Native CAD View Application in which a physical laser is used to make a topographical projection 530 in the physical world.
  • the applications are many, including showing high/low areas, damaged areas, areas of interest, etc.
  • a smaller mobile device is shown in the lower-left corner to demonstrate the simulation while the camera image from the mobile device shows the application.
  • FIG. 5D shows mobile device 500 executing the Native CAD View Application in order to aid in the manufacturing of a wired component, carried out in accordance with the principles of the invention.
  • the application is used to generate the virtual feature 540 on a component being manufactured, which is then projected/expressed in the real world by a connected physical laser.
  • the projected real world laser may then be used as a guide, showing the operator what to do, where, and when (in what order). Labels (such as "P0" in this example) can also be projected.
  • the mobile device, while controlling the laser, can also guide the operator to the correct position.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Disclosed is a method for augmenting reality, working simultaneously in both the real and virtual worlds, by extending a mobile device to make sensing, testing, measurement, projection, layout, marking, etching, painting and related processes completely portable, more intuitive, and highly interactive. The improvements are in accuracy, speed, and ease-of-use leading to new applications and use in new markets. In certain embodiments, virtual imagery may be merged or overlaid with real world imagery (e.g, pictures, video, etc.). For example, real world images and video that are displayed on a mobile device may be merged with laser or video projections from a projection device that is controlled by the mobile device.

Description

METHOD FOR AUGMENTING REALITY BY CONTROLLING
EQUIPMENT WITH A MOBILE DEVICE
FIELD OF THE INVENTION
[0001] The present invention relates in general to merging or overlaying virtual imagery with real world imagery, and more particularly to merging real world images and video that are displayed on a mobile device with laser or video projections from a projection device that is controlled by the mobile device.
BACKGROUND
[0002] Traditionally, fixed structures and large computers were used to control equipment for sensing, testing, measurement, projection, marking, etching and related processes. Attempts to move sensing, testing, measurement, projection, marking, and etching tools (equipment and computer) to the device under test have been met with limited success and, in any event, fail to address the actual need in the art.
[0003] Therefore, there is a need in the art for a method that, while working simultaneously in both the real and virtual worlds, effectively extends the functionality of a mobile device to make sensing, testing, measurement, projection, marking, etching, painting and related processes completely portable, more intuitive, and highly interactive. In this fashion, improvements in accuracy, speed, and ease-of-use may be realized such that new applications and use in new markets may result.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] The features, objects, and advantages of the present invention will become more apparent from the detailed description set forth below when taken in conjunction with the drawings, in which like reference characters identify corresponding elements throughout and wherein:
[0005] Fig. 1A illustrates one embodiment of the "Native CAD View Application."
[0006] Fig. 1B illustrates one embodiment of the "Import CAD Process of the Native CAD View Application."
[0007] Fig. 1C illustrates one embodiment of the "View Controls Process of the Native CAD View Application."
[0008] Fig. 2A illustrates one embodiment of the "Native CAD Markup Application."
[0009] Fig. 2B illustrates one embodiment of the "User Markup Process of the Native CAD Markup Application."
[0010] Fig. 2C illustrates one detailed embodiment of the "View Controls Process of the Native CAD Markup Application."
[0011] Fig. 3A illustrates one embodiment of the "Native CAD Project Application."
[0012] Fig. 3B illustrates one embodiment of the "Selection Process of the Native CAD Project Application."
[0013] Fig. 3C illustrates one embodiment of the "Send Control-Path Process of the Native CAD Project Application."
[0014] Fig. 4A illustrates one embodiment of the "Native CAD Inspect Application."
[0015] Fig. 4B illustrates one embodiment of the "Data Process of the CAD Inspect Application."
[0016] Fig. 4C illustrates one embodiment of the "Report Process of the CAD Inspect Application."
[0017] Fig. 5A illustrates a mobile device depicting a graphical representation of a component assembly process carried out in accordance with the principles of the invention.
[0018] Fig. 5B illustrates a mobile device depicting a graphical representation of how a lay-up process may be carried out using the principles of the invention.
[0019] Fig. 5C illustrates a mobile device depicting a topographical projection in the real world, as controlled by a mobile device in accordance with the principles of the invention.
[0020] Fig. 5D illustrates a mobile device depicting a graphical representation of an exemplary manufacturing process for a wired component, carried out in accordance with the principles of the invention.
BRIEF DESCRIPTION OF THE DRAWING CALLOUTS
[0021] The following are specific callouts referenced in the drawings:
• Callout 101: Start Application,
• Callout 102: Import CAD,
• Callout 103: Load Document/Plan,
• Callout 104: Create Document/Plan,
• Callout 106: Send Commands & Receive Data,
• Callout 120: Process Data,
o Callout 121: Fit Geometry,
o Callout 125: Compare to CAD/Nominal,
o Callout 130: Check Tolerances,
o Callout 135: Accept Result,
o Callout 136: Accept Decision,
• Callout 140: Create Report,
o Callout 142: Type,
o Callout 145: Create HTML Report,
o Callout 150: Create Tabular Report,
o Callout 155: View Report,
• Callout 161: Email Report,
• Callout 162: Store Report on Server,
• Callout 199: End,
• Callout 703: View Controls,
• Callout 705: Man/Auto Selection,
• Callout 761: Email Result,
• Callout 762: Save Result,
• Callout 721: Check if First Read,
• Callout 722: Start Read,
• Callout 723: Determine File Type,
• Callout 724: Supported,
• Callout 725: Stop,
• Callout 726: Save Mathematical Data,
• Callout 727: Save View Mesh,
• Callout 728: Read View Mesh,
• Callout 741: User,
• Callout 742: Zoom,
• Callout 743: Pan,
• Callout 744: Rotate,
• Callout 751: User,
• Callout 752: Mark,
• Callout 753: Place Tag,
• Callout 754: Label Tag,
• Callout 755: Enter Info,
• Callout 763: Send Control-Path,
• Callout 764: Receive Feedback,
• Callout 771: Selection,
• Callout 772: Select Surface(s),
• Callout 773: Select Curve(s),
• Callout 774: Controls Along/#/Dev,
• Callout 781: Select Device,
• Callout 782: Select Params,
• Callout 783: Select Control-Path, and
• Callout 784: Send.
BRIEF SUMMARY OF THE INVENTION
[0022] Disclosed and claimed herein are systems and methods for augmenting reality by controlling equipment with a mobile device. In one embodiment, the method includes receiving, by the mobile device, a native computer aided design (CAD) file modeling a physical component, and then executing the native CAD file so as to display the modeling of the physical component on a display screen of the mobile device. The method further includes receiving a user input to overlay one or more virtual laser projections onto the displayed modeling of the physical component, and then transmitting instructions to a connected laser projector, wherein the instructions are to cause the laser projector to project one or more physical laser projections, corresponding to the one or more virtual laser projections, onto the physical component.
[0023] Other aspects, features, and techniques of the invention will be apparent to one skilled in the relevant art in view of the following detailed description of the invention.
DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
[0024] One aspect of the invention is to effectively enhance real world imagery by merging, on a mobile device, images and video from the real world with laser or video projections made in the real world by a projection device controlled by the mobile device.
[0025] In one embodiment, reality is augmented in capability and functionality through the integration of a mobile device, native CAD (computer-aided design) data, and real world images and video (see illustrative embodiments in Figs. 1A - 1C).
[0026] One or more aspects of the invention are carried out by the novel achievement of importing 'native' CAD files onto a mobile device. The importer is present on the mobile device itself, not on an external computer. Heretofore, prior art systems either 1) required a conversion of the CAD data to another form, such as a triangulated mesh rendition of the CAD data, or 2) relied on an external computer system (e.g., over a network connection) to transmit image information for display on the subject mobile device. However, by loading and executing the actual native CAD file on the mobile device itself, there is no need to convert the CAD data to any other format, nor to incur the resulting loss of the underlying data of the CAD-rendered image. Also, since there is no need to send the CAD data wirelessly to the mobile device, network connectivity and security issues are avoided.
[0027] Another related aspect of the invention is that, by virtue of having all of the native CAD data local to the mobile device, it is further possible to have the mobile device itself perform all in-process calculations, such as projecting a point (XYZ position) onto a surface (CAD data).
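The patent does not spell out the point-to-surface calculation itself; the sketch below (not the patent's implementation) shows the simplest on-device case in Python/NumPy, projecting a probed XYZ point onto a planar CAD face. Projecting onto a curved face would instead iterate over the surface parameterization, but the idea of returning a foot point plus a signed deviation is the same.

```python
import numpy as np

def project_point_to_plane(p, origin, normal):
    """Closest point to p on the plane through `origin` with normal `normal`,
    plus the signed distance (usable later as an actual-to-nominal deviation)."""
    p = np.asarray(p, dtype=float)
    o = np.asarray(origin, dtype=float)
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    d = float(np.dot(p - o, n))      # signed offset of p from the plane
    return p - d * n, d

# Example: a probed point 2.5 units above a planar face lying in the XY plane.
foot, dev = project_point_to_plane([10.0, 4.0, 2.5], origin=[0.0, 0.0, 0.0], normal=[0.0, 0.0, 1.0])
print(foot, dev)   # -> [10.  4.  0.] 2.5
```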
[0028] The above aspects of the invention allow the matching of the virtual and physical worlds. In particular, laser projectors and other measurement-capable devices may be used to collect known data points in the physical world in order to match the CAD-rendered virtual world on the mobile device to the physical world.
[0029] Additionally, the principles of the invention allow for the augmentation of reality in both the outbound sense and the inbound sense. In the outbound sense, control of a laser projection by the mobile device is realized in the real world by the laser projection onto the manufactured part or assembly of interest. The laser projector hardware may be used to guide the operator where to: 1) place a component as part of an assembly process, 2) place a composite-ply as part of a manufacturing process, or 3) perform other tasks such as paint-markup, etc.
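One common way to perform this matching from collected data points is a rigid best-fit between probed physical points and their nominal CAD counterparts. The patent does not name an algorithm; the following sketch assumes the point correspondences are already known and uses the standard SVD-based (Kabsch) solution.

```python
import numpy as np

def best_fit_rigid_transform(measured, nominal):
    """Least-squares rotation R and translation t so that R @ measured_i + t ~ nominal_i.
    `measured` are points probed in the physical world, `nominal` the matching CAD points."""
    P = np.asarray(measured, dtype=float)
    Q = np.asarray(nominal, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                     # 3x3 cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against a reflection
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Example: nominal CAD targets and the same targets probed in the shop, shifted by +5 in X.
nominal  = [[0, 0, 0], [100, 0, 0], [0, 50, 0], [0, 0, 25]]
measured = [[-5, 0, 0], [95, 0, 0], [-5, 50, 0], [-5, 0, 25]]
R, t = best_fit_rigid_transform(measured, nominal)
print(np.round(R, 3), np.round(t, 3))   # ~identity rotation, translation ~[5, 0, 0]
```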
[0030] In the inbound sense, the resulting laser projection may be displayed on the mobile device overlaid onto real world imagery (camera image) of the component of interest.
[0031] As used herein, the terms "a" or "an" shall mean one or more than one. The term "plurality" shall mean two or more than two. The term "another" is defined as a second or more. The terms "including" and/or "having" are open ended (e.g., comprising). The term "or" as used herein is to be interpreted as inclusive or meaning any one or any combination. Therefore, "A, B or C" means any of the following: A; B; C; A and B; A and C; B and C; A, B and C. An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive.
[0032] Reference throughout this document to "one embodiment", "certain embodiments", "an embodiment" or similar term means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments without limitation.
[0033] In accordance with the practices of persons skilled in the art of computer programming, the invention is described below with reference to operations that are performed by a computer system or a like electronic system. Such operations are sometimes referred to as being computer-executed. It will be appreciated that operations that are symbolically represented include the manipulation by a processor, such as a central processing unit, of electrical signals representing data bits and the maintenance of data bits at memory locations, such as in system memory, as well as other processing of signals. The memory locations where data bits are maintained are physical locations that have particular electrical, magnetic, optical, or organic properties corresponding to the data bits.
[0034] When implemented in software, the elements of the invention are essentially the code segments to perform the necessary tasks. The code segments can be stored in a "processor storage medium," which includes any medium that can store information. Examples of the processor storage medium include an electronic circuit, a semiconductor memory device, a ROM, a flash memory or other non-volatile memory, a floppy diskette, a CD-ROM, an optical disk, a hard disk, etc.
[0035] By way of example, Fig. 1A illustrates one embodiment of the "Native CAD View Application" as designed for a mobile device. The mobile device may be any mobile phone (e.g., like an iPhone™), media-pad (e.g., an iPad™, iPod-Touch™) or other similar handheld device.
[0036] Fig. 1B illustrates one embodiment of the "Import CAD Process of the Native CAD View Application." The Import CAD process is improved by maintaining the original native CAD data, a proprietary mathematical representation of the same data, and a proprietary viewable representation of the same data. The proprietary formats may be created as soon as the model is first available. Intelligent decisions regarding which model or piece of model to load improve the process.
[0037] Fig. 1C illustrates one embodiment of the "View Controls Process of the Native CAD View Application." The View Controls Process works both in the virtual world and in an augmented reality combining the native CAD data and a real world image or video.
[0038] In one embodiment, reality is augmented in capability and functionality through the integration of a mobile device, native CAD data, user-defined markup - draw and tags, and images and video of the real world (see illustrative embodiments in Figs. 2A - 2C).
[0039] Fig. 2A illustrates one embodiment of the "Native CAD Markup Application." The Native CAD Markup Application expands on the Native CAD View Application by adding markup controls - draw and tag functionality.
[0040] Fig. 2B illustrates one embodiment of the "User Markup Process of the Native CAD Markup Application." The User Markup process works both in the virtual world and in an augmented reality combining the native CAD data and a real world image or video.
[0041] Fig. 2C illustrates one embodiment of the "View Controls Process of the Native CAD Markup Application." The View Controls Process works both in the virtual world and in an augmented reality combining the native CAD data and a real world image or video.
[0042] In one embodiment, reality is augmented in capability and functionality through the integration of a mobile device and a piece of equipment (see illustrative embodiments in Figs. 3A - 3C) that returns new information back into the real world. The equipment can be any data input device whether used for measurement, testing, sensing (vibration analysis, size, distance, etc.), inspection, quality control, assembly, reverse engineering, automation, robot control, surgical and medical equipment control, communication and reporting, device and peripherals control, marking, etching, projection, or other.
[0043] Fig. 3A illustrates one embodiment of the "Native CAD Project Application." The Native CAD Project Application expands on the Native CAD Markup Application by adding an interface and controls to a projection device. A projection device may be a laser or video projector and is used to overlay an image from the mobile device onto real world imagery (pictures, video, etc.).
[0044] Fig. 3B illustrates one embodiment of the "Selection Process of the Native CAD Project Application." The selection process allows the user to choose surfaces and curves and define appropriate controls for projection. Example controls are based on view, deviation within tolerance from the perfect form, and steps along the projection within the speed of the projector.
[0045] Fig. 3C illustrates one embodiment of the "Send Control-Path Process of the Native CAD Project Application." In certain embodiments, the Send Control-Path Process interfaces and controls a projection device.
The Native CAD View Application— (see illustrative embodiments of Figs. 1A - 1C)
[0046] The Native CAD View Application is a mobile device application for the import and viewing of native CAD files for the Design, Engineering, Manufacturing, Assembly, Quality Control, Reverse Engineering, Projection, Composite Ply Layout, Marking, Etching, and Painting markets. The Native CAD View Application can be used for design & engineering, part & feature comparison, first article inspection, part inspection, production inspection, tool building, assembly, composite ply layout, marking, etching, and painting.
[0047] Within the process flow of the Native CAD View Application, native CAD data is imported (block 102). The import is improved by maintaining the original native CAD data, a proprietary mathematical representation of the same data, and a proprietary viewable representation of the same data. The proprietary formats are created as soon as the model is first available. The Import CAD process begins with a check of whether this is the first data read (block 721). If this is not the first read, then the view-mesh may be read (block 728). If this is the first read, then the import process starts the read (block 722), determines the file type (block 723), and checks if this file type is supported (block 724). If this file type is not supported, then the read may stop (block 725). If the file type is supported, then a proprietary mathematical representation of the CAD data may be created and saved (block 726) and a proprietary viewable representation of the CAD data created and saved (block 727). Intelligent decisions regarding which model or piece of model to load improve the process. The View Controls Process works both in the virtual world and in an augmented reality combining the native CAD data and a real world image or video.
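A minimal sketch of the import/caching flow just described, with block numbers from Fig. 1B in the comments. The file extensions, the JSON caches, and the `parse_native`/`tessellate` helpers are assumptions for illustration; the proprietary representations themselves are not disclosed in the patent.

```python
import json
from pathlib import Path

SUPPORTED = {".step", ".stp", ".iges", ".igs"}   # assumed file types, purely for illustration

def import_cad(path, parse_native, tessellate):
    """Sketch of the Import CAD flow of Fig. 1B; block numbers refer to the callout list.
    `parse_native` and `tessellate` stand in for the undisclosed proprietary representations."""
    src = Path(path)
    math_cache = Path(str(src) + ".math.json")       # proprietary mathematical representation
    mesh_cache = Path(str(src) + ".viewmesh.json")   # proprietary viewable representation
    if mesh_cache.exists():                          # block 721: not the first read
        return json.loads(mesh_cache.read_text())    # block 728: read the cached view mesh
    if src.suffix.lower() not in SUPPORTED:          # blocks 722-724: start read, file type, supported?
        raise ValueError(f"unsupported CAD file type: {src.suffix}")   # block 725: stop
    math_rep = parse_native(src)
    math_cache.write_text(json.dumps(math_rep))      # block 726: save mathematical data
    view_mesh = tessellate(math_rep)
    mesh_cache.write_text(json.dumps(view_mesh))     # block 727: save view mesh
    return view_mesh
```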
[0048] Within the process flow of the Native CAD View Application is a user interface loop with View Controls (block 703). The user can opt to zoom (block 742), pan (block 743), or rotate (block 744) the image. The image may be the native CAD data, a viewable representation of the data, a real world image, real world video, or any combination of the above. Lastly, a resultant image may be generated and emailed (block 761) and/or saved (block 762).
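The zoom, pan, and rotate controls can be thought of as composing a single screen-space transform that is applied to whichever image is currently displayed (the CAD render, the camera frame, or their overlay). A minimal sketch, with the composition order chosen for illustration only:

```python
import numpy as np

def view_matrix(zoom=1.0, pan=(0.0, 0.0), rotate_deg=0.0):
    """Screen-space transform for the View Controls of Fig. 1C:
    rotate (block 744), then zoom (block 742), then pan (block 743)."""
    a = np.radians(rotate_deg)
    R = np.array([[np.cos(a), -np.sin(a), 0.0],
                  [np.sin(a),  np.cos(a), 0.0],
                  [0.0,        0.0,       1.0]])
    S = np.diag([zoom, zoom, 1.0])
    T = np.array([[1.0, 0.0, pan[0]],
                  [0.0, 1.0, pan[1]],
                  [0.0, 0.0, 1.0]])
    return T @ S @ R   # applied to homogeneous pixel coordinates of the displayed image

# Example: zoom in 2x and shift right by 50 px, no rotation.
M = view_matrix(zoom=2.0, pan=(50.0, 0.0))
print(M @ np.array([10.0, 10.0, 1.0]))   # -> [70. 20. 1.]
```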
The Native CAD Markup Application - (see illustrative embodiments of Figs. 2A - 2C)
[0049] The Native CAD Markup Application expands on the Native CAD View Application by adding markup controls - draw and tag functionality. The Native CAD Markup Application can be used for design & engineering, first article inspection, part inspection, production inspection, tool building, assembly, composite ply layout, marking, etching, and painting.
[0050] Within the process flow of the Native CAD Markup Application, native CAD data is imported (block 102). The user can draw and erase on the data (block 752). The user can tag features of the data (block 753), assign a name to the tag (block 754) and attach text, image, video, or other information (block 755). The view controls (block 703) work both in the virtual world and in an augmented reality combining the native CAD data and a real world image or video. Lastly, a resultant image is generated and either emailed (block 761) or saved (block 762).
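The patent does not disclose a data model for markup. One plausible minimal structure, kept separate from the native CAD data so the original model is never modified, might look like the following (the field names are assumptions):

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class MarkupTag:
    """One tag placed on the model (blocks 753-755): a 3D anchor, a user-assigned
    name, and optional attached text or a media file path."""
    anchor_xyz: Tuple[float, float, float]
    name: str
    note: Optional[str] = None
    attachment_path: Optional[str] = None

@dataclass
class MarkupLayer:
    """Free-hand strokes (block 752) and tags, stored alongside but apart from the CAD data."""
    strokes: List[List[Tuple[float, float, float]]] = field(default_factory=list)
    tags: List[MarkupTag] = field(default_factory=list)

# Example: tag a fastener hole and attach a note.
layer = MarkupLayer()
layer.tags.append(MarkupTag(anchor_xyz=(12.5, 40.0, 0.0), name="HOLE-A3", note="verify 6.35 mm"))
```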
The Native CAD Project Application - (see illustrative embodiments of Figs. 3A - 3C)
[0051] The Native CAD Project Application expands on the Native CAD Markup Application by adding an interface and controls to a projection device. The projection device may be a laser or video projector and is used to overlay an image from the mobile device into the real world. The Native CAD Project Application can be used for design & engineering, first article inspection, part inspection, production inspection, tool building, assembly, composite ply layout, marking, etching, and painting.
[0052] Within the process flow of the Native CAD Project Application, a CAD file is imported (block 102). The user can choose solids, surfaces, faces, features, forms, and curves and define appropriate controls for projection. Example controls are based on view, deviation within tolerance from the perfect form, and steps along the projection within the speed of the projector.
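The patent names the controls (deviation within tolerance, steps along the projection, projector speed) but not an algorithm for producing the control-path. The sketch below uses one common approach, chord-deviation subdivision of a parametric curve, with a point cap standing in for the projector-speed limit; all parameter values are illustrative.

```python
import numpy as np

def control_path(curve, t0=0.0, t1=1.0, max_dev=0.1, max_points=2000):
    """Sample a parametric curve `curve(t) -> (x, y, z)` into a projector control path.
    Spans are refined until the mid-point chordal deviation is within `max_dev`
    (the 'deviation within tolerance' control); `max_points` caps the path length."""
    ts = [t0, t1]
    i = 0
    while i < len(ts) - 1 and len(ts) < max_points:
        a, b = ts[i], ts[i + 1]
        mid = 0.5 * (a + b)
        chord_mid = 0.5 * (np.asarray(curve(a)) + np.asarray(curve(b)))
        if np.linalg.norm(np.asarray(curve(mid)) - chord_mid) > max_dev:
            ts.insert(i + 1, mid)    # refine this span
        else:
            i += 1                   # span is within tolerance, move on
    return np.array([curve(t) for t in ts])

# Example: a semicircular arc of radius 100 sampled to a 0.1 chordal deviation.
arc = lambda t: (100 * np.cos(t * np.pi), 100 * np.sin(t * np.pi), 0.0)
print(len(control_path(arc)))
```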
[0053] At the end of the process flow of the Native CAD Project Application, a control-path is sent to a projection device (block 763) and feedback is received (block 764). If there are multiple devices available, the user can select which device (block 781), with which parameters (block 782), and which control-path (block 783) before sending (block 784).
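The actual projector interface is vendor-specific and not disclosed; the following sketch only illustrates the shape of the exchange (send parameters and a control-path, read back feedback) over a plain TCP connection with a made-up JSON message format.

```python
import json
import socket

def send_control_path(device_addr, params, path_points, timeout=5.0):
    """Hypothetical wire format, for illustration only. Sends the selected parameters
    (block 782) and control-path (block 783) to the selected device (block 781) and
    returns its feedback line (block 764)."""
    msg = json.dumps({"params": params,
                      "path": [list(map(float, p)) for p in path_points]}).encode()
    with socket.create_connection(device_addr, timeout=timeout) as s:
        s.sendall(msg + b"\n")                    # block 784: send
        return s.makefile().readline().strip()    # feedback from the projector

# Example (hypothetical address and parameters):
# send_control_path(("192.168.1.50", 9100), {"speed": "max"}, [(0, 0, 0), (10, 0, 0)])
```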
The Native CAD Inspect Application— (see illustrative embodiments of Figs. 4A - 4C)
[0054] The Native CAD Inspect Application is a mobile device application for the Design, Engineering, Manufacturing, Assembly, Painting, Quality Control, Reverse Engineering markets. The CAD Inspect Application can be used for design & engineering, first article inspection, part inspection, production inspection, tool building, assembly, composite ply layout, marking, etching, and painting.
[0055] The Native CAD Inspect Application requires CAD (computer-aided design) data of the part (or other object) to be inspected. The CAD data is the nominal information to which the measured data, or actual information, is compared. This actual-to-nominal comparison using CAD data is the core of the CAD Inspect Application.
[0056] Within the process flow of the Native CAD Inspect Application, a CAD file is imported (block 102), and a plan for inspection may be created (block 104) and/or loaded (block 103). In a loop, commands are sent and data is received (block 106) and processed (block 120). Lastly, a report may be generated (block 140) and emailed (block 161) and/or saved (block 162). In the process-data flow (block 120), the data may be fit geometrically (block 121) using, for example, least-squares fitting algorithms, the results compared to the CAD or nominal data (block 125), the results checked against tolerances (block 130), and the result either accepted or rejected (block 135). If rejected, the process data may repeat. The flow of creating a report (block 140) is to select a type of report (block 142), then to create the appropriate report, either HTML (block 145) or tabular (block 150), and finally to view the report (block 155).
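The patent only says "least-squares fitting algorithms"; as one concrete instance, the sketch below fits a plane to probed points (block 121), derives a flatness value, and applies a tolerance check (blocks 125-135). The plane fit and the flatness criterion are assumptions chosen for illustration.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through measured points (block 121): returns (centroid, unit normal)."""
    P = np.asarray(points, dtype=float)
    c = P.mean(axis=0)
    _, _, Vt = np.linalg.svd(P - c)
    return c, Vt[-1]                 # direction of least variance = plane normal

def inspect_flatness(measured, tolerance):
    """Actual-to-nominal style check (blocks 125-135): fit, measure deviations, accept/reject."""
    c, n = fit_plane(measured)
    deviations = (np.asarray(measured, dtype=float) - c) @ n   # signed distances to the fitted plane
    actual = float(deviations.max() - deviations.min())        # flatness of the measured face
    return {"flatness": actual, "tolerance": tolerance, "accepted": actual <= tolerance}

# Example: four probed points that are flat to within roughly 0.02.
print(inspect_flatness([[0, 0, 0.00], [50, 0, 0.01], [50, 50, 0.02], [0, 50, 0.01]], tolerance=0.05))
```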
[0057] An example measurement plan using the Native CAD Inspect Application (an illustrative numeric sketch of the actuals-to-nominals step follows this list):
• Import a complex CAD model,
• Align the measurement device to the part,
• Best Fit algorithms for "finding" geometry primitives,
• Comparison of Actuals to Nominals,
• Best Fit surface profiles to "find" the part within a cloud of data,
• Reporting results,
• Quality acceptance (Go / No-Go), and
• Process control.
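An illustrative numeric sketch of the "Comparison of Actuals to Nominals" step from the plan above, using an algebraic (Kåsa) least-squares circle fit to probed hole points. The nominal size, tolerance, and probe data are made up for illustration.

```python
import numpy as np

def fit_circle_2d(xy):
    """Algebraic (Kåsa) least-squares circle fit to probed hole points: returns (cx, cy, r)."""
    x, y = np.asarray(xy, dtype=float).T
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return float(cx), float(cy), float(np.sqrt(c + cx ** 2 + cy ** 2))

# Actual-to-nominal check for a 10.00 mm nominal hole with a +/-0.05 mm diameter tolerance.
pts = [(5.02, 0.0), (0.0, 5.01), (-5.03, 0.0), (0.0, -4.99)]
cx, cy, r = fit_circle_2d(pts)
print(f"actual {2*r:.3f} mm, nominal 10.000 mm, pass={abs(2*r - 10.0) <= 0.05}")  # expect ~10.02, pass=True
```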
[0058] Referring now to Fig. 5A, the mobile device 500 is shown as displaying a graphical representation of a CAD modeled component assembly 510. A user interacts with the Native CAD View application executing on the mobile device to overlay desired virtual laser projection lines 520 onto the CAD model 510. In the real world, a physical laser projector would then be controlled by the mobile device to generate real versions of projection lines 520. The real world orientation of the physical laser projector can be seen from the fact that the projection emanates diagonally from the top-right of the modeled component assembly 510. Thus, the invention preserves and displays the position and orientation of the laser as referenced in the real world relative to the modeled component assembly 510.
[0059] Fig. 5B depicts the mobile device 500 executing the Native CAD View application in which a composite ply lay-up or layering of sheets of composite materials is shown. The result is a build-up of a 3-dimensional thickness. The laser projector is used to guide the operator. The mobile device is used to simplify the process through simulation (before), process control (during), and verification (after).
[0060] Fig. 5C depicts the mobile device 500 executing the Native CAD View Application in which a physical laser is used to make a topographical projection 530 in the physical world. The applications are many, including showing high/low areas, damaged areas, areas of interest, etc. A smaller mobile device is shown in the lower-left corner to demonstrate the simulation, while the camera image from the mobile device shows the application.
[0061] Finally, FIG. 5D shows mobile device 500 executing the Native CAD View Application in order to aid in the manufacturing of a wired component, carried out in accordance with the principles of the invention. In this case, the application is used to generate the virtual feature 540 on a component being manufactured, which is then projected/expressed in the real world by a connected physical laser. The projected real world laser may then be used as a guide, showing the operator what to do, where, and when (in what order). Labels (such as "P0" in this example) can also be projected. The mobile device, while controlling the laser, can also guide the operator to the correct position.
[0062] It should be appreciated that the foregoing is equally applicable to any mobile device and any application in which mobile device capabilities may be extended. It should further be appreciated that any mobile device may be configured to connect to the apparatus, in accordance with the principles of the invention, and that the invention is independent of any particular operating system that the mobile device may be running.
[0063] While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art. Trademarks and copyrights referred to herein are the property of their respective owners.

Claims

CLAIMS
What is claimed is:
1. A method for augmenting reality by controlling equipment with a mobile device comprising:
receiving, by the mobile device, a native computer aided design (CAD) file modeling a physical component;
executing the native CAD file so as to display the modeling of the physical component on a display screen of the mobile device;
receiving a user input to overlay one or more virtual laser projections onto the displayed modeling of the physical component;
transmitting, by the mobile device, instructions to a connected laser projector, wherein the instructions are to cause the laser projector to project one or more physical laser projections, corresponding to the one or more virtual laser projections, onto the physical component.
2. A system configured to perform the method of claim 1.
PCT/US2011/000662 2010-04-12 2011-04-11 Method for augmenting reality by controlling equipment with a mobile device WO2011129880A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US32324510P 2010-04-12 2010-04-12
US61/323,245 2010-04-12

Publications (1)

Publication Number Publication Date
WO2011129880A1 true WO2011129880A1 (en) 2011-10-20

Family

ID=44798955

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2011/000662 WO2011129880A1 (en) 2010-04-12 2011-04-11 Method for augmenting reality by controlling equipment with a mobile device

Country Status (1)

Country Link
WO (1) WO2011129880A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109696915A (en) * 2019-01-07 2019-04-30 上海托华机器人有限公司 A kind of test method and system
CN111091625A (en) * 2018-10-23 2020-05-01 波音公司 Augmented reality system for manufacturing composite parts

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020113784A1 (en) * 2000-12-29 2002-08-22 Feilmeier Michael Leon Portable computer aided design apparatus and method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020113784A1 (en) * 2000-12-29 2002-08-22 Feilmeier Michael Leon Portable computer aided design apparatus and method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
GAUSEMEIER ET AL.: "Development of a Real Time Image Based Object Recognition Method for Mobile AR-Devices", AFRIGRAPH '03 PROCEEDINGS OF THE 2ND INTERNATIONAL CONFERENCE ON COMPUTER GRAPHICS, VIRTUAL REALITY, VISUALISATION AND INTERACTION IN AFRICA, 2003, pages 133 - 139, Retrieved from the Internet <URL:http://delivery.acm.org/10.1145/610000/602355/p133-gausemeier.pdf?ip=209.155.214.108CFID=280264508CFTOKEN=25569522&_acm_=1307657885_5270a43cdcad0393a4c461de9172bbb4> [retrieved on 20110608] *
HENRYSSON: "Bringing Augmented Reality to Mobile Phones", DISSERTATIONS, NO. 1145, LINKOPING STUDIES IN SCIENCE AND TECHNOLOGY, 2007, Retrieved from the Internet <URL:http://liu.diva-portal.org/smash/get/diva2:16967/FULLTEXT01> [retrieved on 20110608] *
SCHWERDTFEGER ET AL.: "Using laser projectors for augmented reality", PROCEEDING VRST '08 PROCEEDINGS OF THE 2008 ACM SYMPOSIUM ON VIRTUAL REALITY SOFTWARE AND TECHNOLOGY, 2008, pages 134 - 137, Retrieved from the Internet <URL:http://delivery.acm.org/10.1145/1460000/1450608/p134-schwerdtteger.pdf?ip=209.155.214.10&CFID=279829428CFTOKEN=340306508_acm_=1307655667_b40cb2517628830fa9aa93e73faec00d> [retrieved on 20110608] *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111091625A (en) * 2018-10-23 2020-05-01 波音公司 Augmented reality system for manufacturing composite parts
CN109696915A (en) * 2019-01-07 2019-04-30 上海托华机器人有限公司 A kind of test method and system

Similar Documents

Publication Publication Date Title
US11663732B2 (en) System and method for using images from a commodity camera for object scanning, reverse engineering, metrology, assembly, and analysis
Han et al. Potential of big visual data and building information modeling for construction performance analytics: An exploratory study
Ahn et al. 2D drawing visualization framework for applying projection-based augmented reality in a panelized construction manufacturing facility: Proof of concept
Kim et al. Interactive modeler for construction equipment operation using augmented reality
Ammari et al. Collaborative BIM-based markerless mixed reality framework for facilities maintenance
Kim et al. Improvement of realism of 4D objects using augmented reality objects and actual images of a construction site
JP2008065586A (en) Parts identification image creation device, program, and storage medium
US20180204153A1 (en) Architectural Planning Method
Vincke et al. Immersive visualisation of construction site point cloud data, meshes and BIM models in a VR environment using a gaming engine
US11062523B2 (en) Creation authoring point tool utility to recreate equipment
US8311320B2 (en) Computer readable recording medium storing difference emphasizing program, difference emphasizing method, and difference emphasizing apparatus
Yu et al. Collaborative SLAM and AR-guided navigation for floor layout inspection
WO2011129880A1 (en) Method for augmenting reality by controlling equipment with a mobile device
US8244235B2 (en) System and method for extending a mobile device to control, connect and communicate with equipment, networks and systems
JP6842819B2 (en) Road structure inspection information management system
JP7101381B2 (en) Parts management system and parts management method
KR102458559B1 (en) Construction management system and method using mobile electric device
KR101958199B1 (en) Configuration management system by the 3D model, The system for maintenance of small and mediun-sized plant
KR20180090499A (en) Applying method 3D model to goods for augmented reality and virtual reality shopping mall
CN108062786B (en) Comprehensive perception positioning technology application system based on three-dimensional information model
Liu et al. System development of an augmented reality on-site BIM viewer based on the integration of SLAM and BLE indoor positioning
US10599710B2 (en) System and method for generating digital information and altering digital models of components with same
JP2006059014A (en) Device for calculating distance of three-dimensional cad data and measured three-dimensional data, distance calculating method, and its program
JP2004252815A (en) Image display device, its method and program
US20230221120A1 (en) A system and method for remote inspection of a space

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11769205

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11769205

Country of ref document: EP

Kind code of ref document: A1