US20150279087A1 - 3d data to 2d and isometric views for layout and creation of documents - Google Patents


Info

Publication number
US20150279087A1
Authority
US
United States
Prior art keywords
method
dimensional model
model data
boundaries
set
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/671,420
Inventor
Stephen Brooks Myers
Jacob Abraham Kuttothara
Steven Donald Paddock
John Moore Wathen
Andrew Slatton
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Knockout Concepts LLC
Original Assignee
Knockout Concepts LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US201461971036P priority Critical
Application filed by Knockout Concepts LLC filed Critical Knockout Concepts LLC
Priority to US14/671,420 priority patent/US20150279087A1/en
Assigned to KNOCKOUT CONCEPTS, LLC reassignment KNOCKOUT CONCEPTS, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MEYERS, STEPHEN B
Assigned to KNOCKOUT CONCEPTS, LLC reassignment KNOCKOUT CONCEPTS, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KUTTOTHARA, JACOB A, PADDOCK, STEVEN D, SLATTON, ANDREW, WATHEN, JOHN M
Publication of US20150279087A1 publication Critical patent/US20150279087A1/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical means
    • G01B11/26Measuring arrangements characterised by the use of optical means for measuring angles or tapers; for testing the alignment of axes
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/15Correlation function computation including computation of convolution operations
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06KRECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00201Recognising three-dimensional objects, e.g. using range or tactile information
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06KRECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/03Detection or correction of errors, e.g. by rescanning the pattern
    • G06K9/036Evaluation of quality of acquired pattern
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06KRECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/36Image preprocessing, i.e. processing the image information without deciding about the identity of the image
    • G06K9/46Extraction of features or characteristics of the image
    • G06K9/4604Detecting partial patterns, e.g. edges or contours, or configurations, e.g. loops, corners, strokes, intersections
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00Animation
    • G06T13/203D [Three Dimensional] animation
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/10Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06KRECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K2209/00Indexing scheme relating to methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K2209/40Acquisition of 3D measurements of objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30168Image quality inspection

Abstract

This application relates to methods for generating two-dimensional images from three-dimensional model data. A process according to the application may begin with providing a set of three-dimensional model data of a subject, and determining a set of boundaries between intersecting surfaces of the set of three-dimensional model data. A user or an algorithm may select a view of the three-dimensional model data to convert to a two-dimensional image. The process may further include determining an outline of the three-dimensional model corresponding to the selected view, and projecting the outline of the three-dimensional model and a visible portion of the set of boundaries onto a two-dimensional image plane.

Description

    I. BACKGROUND OF THE INVENTION
  • A. Field of Invention
  • Embodiments generally relate to creating technical drawings from 3D model data.
  • B. Description of the Related Art
  • A variety of methods are known in the art for generating 2D images from 3D models. For instance, it is known to generate a collage of 2D renderings that represent a 3D model. It is further known to identify vertices and edges of objects in images. The prior art also includes methods for flattening 3D surfaces to 2D quadrilateral line drawings in a 2D image plane. However, the art is deficient in a number of regards. For instance, the prior art does not teach or suggest fitting a 3D point cloud to a set of simple 2D surfaces, determining boundaries and vertices of the 2D surfaces and projecting them onto an image plane.
  • Some embodiments of the present invention may provide one or more benefits or advantages over the prior art.
  • II. SUMMARY OF THE INVENTION
  • Some embodiments may relate to a method for generating two-dimensional images, comprising the steps of: providing a set of three-dimensional model data of a subject; determining a set of boundaries between intersecting surfaces of the set of three-dimensional model data; selecting a view of the three-dimensional model data to convert to a two-dimensional image; determining an outline of the three-dimensional model data corresponding to the selected view of the three-dimensional model data; determining the portion of the set of boundaries that would be invisible in the selected view due to opacity of the subject; and projecting the outline of the three-dimensional model data and the visible portion of the set of boundaries onto a two-dimensional image plane.
  • Embodiments may further comprise projecting the invisible boundaries on the two-dimensional image plane in a form visually distinguishable from the visible boundaries.
  • According to some embodiments the form visually distinguishable from the visible boundaries comprises dashed, dotted, or broken lines.
  • According to some embodiments the three-dimensional model data comprises a point cloud.
  • Embodiments may further comprise the step of converting the point cloud to a set of continuous simple surfaces using a fitting method selected from one or more of a random sample consensus (RANSAC) method, an iterative closest point method, a least squares method, a Newtonian method, a quasi-Newtonian method, or an expectation-maximization method.
  • According to some embodiments a simple surface comprises a planar surface, a cylindrical surface, a spherical surface, a sinusoidal surface, or a conic surface.
  • According to some embodiments the step of selecting a view comprises orienting a three-dimensional model defined by the three-dimensional model data so that the planar bounded region with the largest convex hull is visible.
  • According to some embodiments the step of determining a set of boundaries comprises a Kreveld method, a Dey Wang method, or an iterative simple surface intersection method.
  • According to some embodiments the three-dimensional model data comprises a mesh.
  • According to some embodiments the step of determining a set of boundaries comprises finding sharp angles between intersecting simple surfaces according to a dihedral angle calculation.
  • Other benefits and advantages will become apparent to those skilled in the art to which it pertains upon reading and understanding of the following detailed specification.
  • III. BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention may take physical form in certain parts and arrangement of parts, embodiments of which will be described in detail in this specification and illustrated in the accompanying drawings which form a part hereof and wherein:
  • FIG. 1 is a flowchart showing an image conversion process according to an embodiment of the invention;
  • FIG. 2 is a schematic view of a user capturing 3D model data with a 3D scanning device;
  • FIG. 3 is a drawing of a point cloud being converted into an isometric drawing;
  • FIG. 4 is a drawing showing the use of a set of simple surfaces for generating 2D drawings;
  • FIG. 5 is a drawing of a device according to an embodiment of the invention; and
  • FIG. 6 is an illustrative printout according to an embodiment of the invention.
  • IV. DETAILED DESCRIPTION OF THE INVENTION
  • A method for generating two-dimensional images includes determining a set of boundaries between intersecting surfaces of three-dimensional model data corresponding to an object. A specific view of the three-dimensional model data, for which the two-dimensional images are required, is selected. Upon selection of the specific view, the outline of the three-dimensional model data corresponding to the selected view is determined, and the portion of the boundaries that is invisible due to the opacity of the object is identified. The outline of the three-dimensional model data and the visible portion of the boundaries so determined are projected onto a two-dimensional image plane.
  • Referring now to the drawings, wherein the showings are for purposes of illustrating embodiments of the invention only and not for purposes of limiting the same, FIG. 1 depicts a flow diagram 100 of an illustrative embodiment wherein three-dimensional data 110 is provided for the purpose of generating corresponding two-dimensional images. The three-dimensional data may be in the form of a point cloud or a mesh representation of a three-dimensional subject. Furthermore, any and all other forms of three-dimensional data representation, now known or developed in the future, that are capable of being converted to point cloud or mesh form may be used.
  • The point cloud or mesh may be further converted to a set or sets of continuous simple surfaces by using a fitting method including but not limited to a random sample consensus (RANSAC) method, an iterative closest point method, a least squares method, a Newtonian method, a quasi-Newtonian method, or an expectation-maximization method. All these methods are well understood in the art and their methodologies are incorporated by reference herein. Any simple geometric surface including but not limited to a planar surface, cylindrical surface, spherical surface, sinusoidal surface, or a conic surface may be used to represent the point cloud as the set of simple continuous surfaces.
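  • To make the fitting step concrete, the sketch below (illustrative only and not part of the specification; it assumes NumPy, and the helper name ransac_plane is hypothetical) fits a single plane to a point cloud with a basic RANSAC hypothesize-and-verify loop. Fitting the other simple surfaces named above, such as cylinders or spheres, would follow the same pattern with a different candidate parameterization.

```python
import numpy as np

def ransac_plane(points, n_iters=200, tol=0.01, seed=None):
    """Fit one plane to a 3D point cloud with a basic RANSAC loop.

    Returns ((unit normal n, offset d) for the plane n.x = d, inlier mask).
    The iteration count and inlier tolerance are illustrative defaults.
    """
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = None
    for _ in range(n_iters):
        # Hypothesize: a plane through 3 randomly sampled points.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:               # skip degenerate (collinear) samples
            continue
        n = n / norm
        d = n @ p0
        # Verify: count points within the distance tolerance of the plane.
        inliers = np.abs(points @ n - d) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (n, d)
    return best_plane, best_inliers

# Illustrative use: a noisy planar patch (z ~ 0) plus scattered outliers.
rng = np.random.default_rng(0)
plane_pts = np.c_[rng.uniform(-1, 1, (500, 2)), rng.normal(0, 0.005, 500)]
outliers = rng.uniform(-1, 1, (100, 3))
(normal, d), mask = ransac_plane(np.vstack([plane_pts, outliers]), seed=1)
print(normal, d, mask.sum())           # normal ~ (0, 0, +/-1), d ~ 0
```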
  • A set of boundaries between intersecting surfaces of the three-dimensional model data is determined 112. In an illustrative embodiment this determination of a set of boundaries may be achieved by using a Kreveld method, a Dey Wang method, or an iterative simple surface intersection method. All these methods are well understood in the art and their methodologies are incorporated by reference herein. In an alternate embodiment wherein the three-dimensional model data is represented as a mesh, the set of boundaries may be determined by finding sharp angles between intersecting simple surfaces according to a dihedral angle calculation.
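  • For the mesh case, the dihedral angle calculation can be sketched as follows (again illustrative and assuming NumPy; sharp_edges is a hypothetical helper): compute a normal per triangular face, group faces by shared edge, and flag edges where the adjacent normals differ by more than a threshold angle.

```python
import numpy as np
from collections import defaultdict

def sharp_edges(vertices, faces, angle_thresh_deg=30.0):
    """Flag mesh edges whose dihedral angle deviates from flat by more than
    the threshold -- a simple stand-in for the boundary-finding step on
    mesh data. `faces` is an (F, 3) array of vertex indices with
    consistent winding."""
    v = np.asarray(vertices, float)
    # Unit normal of each triangular face.
    tri = v[faces]
    normals = np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0])
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    # Map each undirected edge to the faces that share it.
    edge_faces = defaultdict(list)
    for fi, (a, b, c) in enumerate(faces):
        for e in ((a, b), (b, c), (c, a)):
            edge_faces[tuple(sorted(e))].append(fi)
    sharp = []
    cos_thresh = np.cos(np.radians(angle_thresh_deg))
    for edge, fs in edge_faces.items():
        if len(fs) == 2:
            cos_angle = np.clip(normals[fs[0]] @ normals[fs[1]], -1, 1)
            if cos_angle < cos_thresh:   # normals differ by > threshold
                sharp.append(edge)
    return sharp

# A triangulated unit cube: each edge between perpendicular faces is sharp.
verts = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
faces = np.array([[0,1,3],[0,3,2],[4,6,7],[4,7,5],[0,4,5],[0,5,1],
                  [2,3,7],[2,7,6],[0,2,6],[0,6,4],[1,5,7],[1,7,3]])
print(len(sharp_edges(verts, faces)))   # 12 cube edges
```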
  • Once the set of boundaries between intersecting surfaces of the three-dimensional model data is determined, a view of the image data for which two-dimensional images are required is selected 114. In one embodiment, the view may be selected by orienting a three-dimensional model defined by the three-dimensional model data so that the planar bounded region with the largest convex hull is visible. Based on the selected view, an outline of the image data corresponding to the view is determined 116. In one embodiment, the outline determination may be based upon selecting the portion of the image data from one visible edge to the other in the selected view. Also, the portion of the set of boundaries that would be invisible in the selected view due to opacity of the subject is determined 118. In another embodiment, the portion of the set of boundaries visible from the selected viewpoint is determined directly, thereby excluding the invisible boundaries. The determined outline and the visible portion of the set of boundaries are projected on a two-dimensional image plane 120.
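  • The projection step 120 can be illustrated with a simple orthographic projection, sketched below under the assumption of NumPy. The outline and visibility determinations are the separate steps 116 and 118; this sketch only maps 3D boundary points onto the image plane perpendicular to the chosen view direction.

```python
import numpy as np

def orthographic_projection(points, view_dir):
    """Project 3D points onto the image plane perpendicular to view_dir.

    Returns (N, 2) coordinates in a right-handed basis of that plane.
    A simple stand-in for the projection step; no perspective is applied.
    """
    w = np.asarray(view_dir, float)
    w /= np.linalg.norm(w)
    # Pick any vector not parallel to w to seed the in-plane basis.
    seed = np.array([1.0, 0.0, 0.0]) if abs(w[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(seed, w); u /= np.linalg.norm(u)
    v = np.cross(w, u)                  # completes the orthonormal triad
    return np.asarray(points, float) @ np.stack([u, v], axis=1)

# Top view (looking down -z): x and y survive as plane coordinates, z is dropped.
pts = np.array([[0., 0., 0.], [1., 0., 2.], [0., 1., 5.]])
print(orthographic_projection(pts, [0, 0, -1]))
```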
  • In another embodiment, the invisible portion of the boundaries may also be depicted on a 2D image plane in a manner that distinguishes the invisible boundaries from the visible boundaries. One illustrative mechanism of distinguishing invisible boundaries from visible ones may involve use of dashed, dotted, or broken lines.
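  • A minimal rendering sketch of that convention, assuming Matplotlib (draw_boundaries is a hypothetical helper and the hidden segment below is purely illustrative), might look like this:

```python
import matplotlib.pyplot as plt

def draw_boundaries(visible, hidden):
    """Plot visible boundaries as solid lines and hidden boundaries as
    dashed lines, per the drafting convention described above. Each
    boundary is a 2D segment ((x0, y0), (x1, y1)) on the image plane."""
    fig, ax = plt.subplots()
    for (x0, y0), (x1, y1) in visible:
        ax.plot([x0, x1], [y0, y1], color="black", linestyle="-")
    for (x0, y0), (x1, y1) in hidden:
        ax.plot([x0, x1], [y0, y1], color="black", linestyle="--")
    ax.set_aspect("equal")
    return fig

# A square outline with one illustrative hidden edge shown dashed.
outline = [((0, 0), (1, 0)), ((1, 0), (1, 1)), ((1, 1), (0, 1)), ((0, 1), (0, 0))]
draw_boundaries(outline, hidden=[((0.2, 0.2), (0.8, 0.8))])
plt.show()
```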
  • FIG. 2 depicts an illustrative embodiment 200 wherein a three-dimensional scanner 210 is used to scan and obtain three-dimensional model data 216 of a real world subject 212. The three-dimensional model data 216 is obtained by scanning the subject 212 from various directions and orientations 214. The scanner 210 may be any known or future-developed 3D scanner, including but not limited to mobile devices, smart phones or tablets configured to scan and obtain three-dimensional model data.
  • FIG. 3 depicts an illustrative embodiment 300 wherein the three-dimensional model data of the real world subject is represented in the form of a point cloud 310. This point cloud representation may be further converted to a set or sets of continuous simple surfaces 312. As discussed previously herein, this conversion may be achieved by using a fitting method including but not limited to a random sample consensus (RANSAC) method, an iterative closest point method, a least squares method, a Newtonian method, a quasi-Newtonian method, or an expectation-maximization method. The simple surfaces used to represent the point cloud may be any simple geometric surfaces (polygonal and cylindrical surfaces in this case) including but not limited to planar, cylindrical, spherical, sinusoidal, or conic surfaces. In one embodiment, a set of boundaries between the intersecting simple surfaces is determined using various methods known in the art including but not limited to a Kreveld method, a Dey Wang method, or an iterative simple surface intersection method. In another embodiment, where a mesh model is used instead of a point cloud, the boundaries may also be determined by finding sharp angles between intersecting simple surfaces according to a dihedral angle calculation.
  • FIG. 4 depicts an illustrative embodiment 400 wherein the three-dimensional model data, represented as a set of continuous simple surfaces 312, is used for 2D image generation. A view of the set of continuous simple surfaces 312 is chosen, and the determined outline and the visible portion of the set of boundaries corresponding to the chosen view are projected on a two-dimensional image plane. For example, the top view 412, front view 416, or side view 414 may be chosen and projected. Optionally, the invisible boundaries 418 may be depicted using dashed, dotted, or broken lines. Furthermore, because of the nature of the image data collected and reconstructed, it is possible to produce drawings having precise dimensions, such as the ones shown in FIG. 4 elements 412 and 414.
  • It is also contemplated to include a dimensional standard in the collected 3D model data so that drawings can be made to scale, e.g. at a 1:1 scale with measurements identical to those of the real-world object being modeled. For instance, in some embodiments the scanning device may be equipped with features for measuring its distance from the object being scanned, and may therefore be capable of accurately determining dimensions. Embodiments may also include the ability to manipulate scale, so that a drawing of a very large object can be rendered at a more manageable scale such as 1:10, as in the sketch below. It may further be advantageous to include dimensions on the 3D or 2D drawings produced according to embodiments of the invention in the form of annotations similar to those shown in FIG. 4 elements 412 and 414.
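  • As a trivial illustration of such scale manipulation (a hypothetical helper, not part of the specification), a measured real-world dimension can be mapped into drawing units before being annotated on the 2D drawing:

```python
def to_drawing_units(real_mm, scale=(1, 10)):
    """Map a measured real-world dimension to drawing units at a given
    scale; (1, 10) means 1:10, so a 2500 mm edge becomes 250 mm on paper."""
    drawing, real = scale
    return real_mm * drawing / real

print(to_drawing_units(2500))           # 250.0 at 1:10
print(to_drawing_units(2500, (1, 1)))   # 2500.0 at full 1:1 scale
```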
  • FIG. 5 depicts an embodiment 500 illustrating a user device 510 with a capacitive touch screen 512 and interface, which may be configured either to carry out the method provided herein or to receive the 2D images and other related data generated using the method provided herein. The device 510 may be any device with computing and processing capabilities, including but not limited to mobile phones, tablets, smart phones and the like. The device 510 may be adapted to display the point cloud 310 of the scanned subject and the corresponding set of continuous simple surfaces 312. The various views, such as the top view 412, side view 414 and front view 416, may also be displayed on the screen 512 of the device 510. The device 510 may connect to a printing device 520 to enable physical printing of the 2D images and other related information. It will be understood that images may be stored in the form of digital documents as well, and that the invention is not limited to printed documents. The device 510 may be connected to the printing device 520 through a wired connection 518 or wirelessly 516. The wireless connection 516 with the printing device 520 may include Wi-Fi, Bluetooth or any other now known or future-developed method of wireless connectivity. Contextual touch screen buttons 514 on the screen 512 of the device 510 may be configured to carry out various actions such as executing a print command, zooming in or out, or selecting different views of the set of continuous simple surfaces 312.
  • FIG. 6 depicts an illustrative embodiment 600 of a physical print or digital document 610 of the 2D images obtained using the methods described herein. A two-dimensional representation of the set of continuous simple surfaces 312 and various 2D images such as the top view 412, side view 414 and front view 416 may be depicted in the document 610. The document 610 may also contain additional information in the form of notes 612 or annotations with respect to the 2D images, as well as header 614 and footer 616 sections. For instance, embodiments of the invention may include the ability to precisely measure the actual dimensions of an object being scanned; notes and annotations may therefore include, without limitation, the volume of the object, the object's dimensions, its texture and color, its location as determined by an onboard GPS, the time and date the scan was taken, the operator's name, or any other data that may be convenient to store with the scan data. If the average density of the object is known, even the weight of the object could be determined and displayed in the notes.
  • It will be apparent to those skilled in the art that the above methods and apparatuses may be changed or modified without departing from the general scope of the invention. The invention is intended to include all such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.
  • Having thus described the invention, it is now claimed:

Claims (17)

I/we claim:
1. A method for generating two-dimensional images, comprising the steps of:
providing a set of three-dimensional model data of a subject;
determining a set of boundaries between intersecting surfaces of the set of three-dimensional model data;
selecting a view of the three-dimensional model data to convert to a two-dimensional image;
determining an outline of the three-dimensional model data corresponding to the selected view of the three-dimensional model data; and
projecting the outline of the three-dimensional model data and a visible portion of the set of boundaries onto a two-dimensional image plane.
2. The method of claim 1, further comprising the step of determining the portion of the set of boundaries that would be invisible in the selected view due to opacity of the subject.
3. The method of claim 2, further comprising the step of projecting the invisible boundaries on the two-dimensional image plane in a form visually distinguishable from the visible boundaries.
4. The method of claim 3, wherein the form visually distinguishable from the visible boundaries comprises dashed, dotted, or broken lines.
5. The method of claim 1, wherein the three-dimensional model data comprises a point cloud.
6. The method of claim 5, further comprising the step of converting the point cloud to a set of continuous simple surfaces using a fitting method selected from one or more of a random sample consensus (RANSAC) method, an iterative closest point method, a least squares method, a Newtonian method, a quasi-Newtonian method, or an expectation-maximization method.
7. The method of claim 6, wherein a simple surface comprises a planar surface, a cylindrical surface, a spherical surface, a sinusoidal surface, or a conic surface.
8. The method of claim 1, wherein the step of selecting a view comprises orienting a three-dimensional model defined by the three-dimensional model data so that the planar bounded region with the largest convex hull is visible.
9. The method of claim 5, wherein the step of determining a set of boundaries comprises a Kreveld method, a Dey Wang method, or an iterative simple surface intersection method.
10. The method of claim 1, wherein the three-dimensional model data comprises a mesh.
11. The method of claim 10, wherein the step of determining a set of boundaries comprises finding sharp angles between intersecting simple surfaces according to a dihedral angle calculation.
12. A method for generating two-dimensional images, comprising the steps of:
providing a set of three-dimensional model data of a subject, wherein the three-dimensional model data comprises a point cloud;
converting the point cloud to a set of continuous simple surfaces using a fitting method selected from one or more of a random sample consensus (RANSAC) method, an iterative closest point method, a least squares method, a Newtonian method, a quasi-Newtonian method, or an expectation-maximization method, wherein a simple surface comprises a planar surface, a cylindrical surface, a spherical surface, a sinusoidal surface, or a conic surface;
determining a set of boundaries between the intersecting simple surfaces, wherein the step of determining a set of boundaries comprises a Kreveld method, a Dey Wang method, or an iterative simple surface intersection method;
selecting a view of the three-dimensional model data to convert to a two-dimensional image, wherein the step of selecting a view comprises orienting a three-dimensional model defined by the three-dimensional model data so that the planar bounded region with the largest convex hull is visible;
determining an outline of the three-dimensional model data corresponding to the selected view of the three-dimensional model data;
determining the portion of the set of boundaries that would be invisible in the selected view due to opacity of the subject; and
projecting the outline of the three-dimensional model data and the visible portion of the set of boundaries onto a two-dimensional image plane.
13. The method of claim 12, further comprising projecting the invisible boundaries on the two-dimensional image plane in a form visually distinguishable from the visible boundaries.
14. The method of claim 13, wherein the form visually distinguishable from the visible boundaries comprises dashed, dotted, or broken lines.
15. A method for generating two-dimensional images, comprising the steps of:
providing a set of three-dimensional model data of a subject, wherein the three-dimensional model data comprises a mesh;
determining a set of boundaries between intersecting surfaces of the set of three-dimensional model data, wherein the step of determining a set of boundaries comprises finding sharp angles between intersecting simple surfaces according to a dihedral angle calculation;
selecting a view of the three-dimensional model data to convert to a two-dimensional image;
determining an outline of the three-dimensional model data corresponding to the selected view of the three-dimensional model data;
determining the portion of the set of boundaries that would be invisible in the selected view due to opacity of the subject; and
projecting the outline of the three-dimensional model data and the visible portion of the set of boundaries onto a two-dimensional image plane.
16. The method of claim 15, further comprising projecting the invisible boundaries on the two-dimensional image plane in a form visually distinguishable from the visible boundaries.
17. The method of claim 16, wherein the form visually distinguishable from the visible boundaries comprises dashed, dotted, or broken lines.
US14/671,420 2014-03-27 2015-03-27 3d data to 2d and isometric views for layout and creation of documents Abandoned US20150279087A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201461971036P 2014-03-27 2014-03-27
US14/671,420 US20150279087A1 (en) 2014-03-27 2015-03-27 3d data to 2d and isometric views for layout and creation of documents

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US14/671,420 US20150279087A1 (en) 2014-03-27 2015-03-27 3d data to 2d and isometric views for layout and creation of documents

Publications (1)

Publication Number Publication Date
US20150279087A1 true US20150279087A1 (en) 2015-10-01

Family

ID=54189850

Family Applications (5)

Application Number Title Priority Date Filing Date
US14/672,048 Active 2035-11-14 US9841277B2 (en) 2014-03-27 2015-03-27 Graphical feedback during 3D scanning operations for obtaining optimal scan resolution
US14/671,749 Abandoned US20150279121A1 (en) 2014-03-27 2015-03-27 Active Point Cloud Modeling
US14/671,420 Abandoned US20150279087A1 (en) 2014-03-27 2015-03-27 3d data to 2d and isometric views for layout and creation of documents
US14/671,313 Abandoned US20150279075A1 (en) 2014-03-27 2015-03-27 Recording animation of rigid objects using a single 3d scanner
US14/671,373 Abandoned US20150278155A1 (en) 2014-03-27 2015-03-27 Identifying objects using a 3d scanning device, images, and 3d models

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US14/672,048 Active 2035-11-14 US9841277B2 (en) 2014-03-27 2015-03-27 Graphical feedback during 3D scanning operations for obtaining optimal scan resolution
US14/671,749 Abandoned US20150279121A1 (en) 2014-03-27 2015-03-27 Active Point Cloud Modeling

Family Applications After (2)

Application Number Title Priority Date Filing Date
US14/671,313 Abandoned US20150279075A1 (en) 2014-03-27 2015-03-27 Recording animation of rigid objects using a single 3d scanner
US14/671,373 Abandoned US20150278155A1 (en) 2014-03-27 2015-03-27 Identifying objects using a 3d scanning device, images, and 3d models

Country Status (1)

Country Link
US (5) US9841277B2 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160071327A1 (en) * 2014-09-05 2016-03-10 Fu Tai Hua Industry (Shenzhen) Co., Ltd. System and method for simplifying a mesh point cloud
US20160188159A1 (en) * 2014-12-30 2016-06-30 Dassault Systemes Selection of a viewpoint of a set of objects
US20160196659A1 (en) * 2015-01-05 2016-07-07 Qualcomm Incorporated 3d object segmentation
EP3188049A1 (en) * 2015-12-30 2017-07-05 Dassault Systèmes Density based graphical mapping
US10049479B2 (en) 2015-12-30 2018-08-14 Dassault Systemes Density based graphical mapping
US10127333B2 (en) 2015-12-30 2018-11-13 Dassault Systemes Embedded frequency based search and 3D graphical data processing
US20190005709A1 (en) * 2017-06-30 2019-01-03 Apple Inc. Techniques for Correction of Visual Artifacts in Multi-View Images
US10360438B2 (en) 2015-12-30 2019-07-23 Dassault Systemes 3D to 2D reimaging for search
US10754242B2 (en) 2017-06-30 2020-08-25 Apple Inc. Adaptive resolution and projection format in multi-direction video
US10762595B2 (en) * 2018-11-07 2020-09-01 Steelcase, Inc. Designated region projection printing of spatial pattern for 3D object on flat sheet in determined orientation

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160125638A1 (en) * 2014-11-04 2016-05-05 Dassault Systemes Automated Texturing Mapping and Animation from Images
CN105551078A (en) * 2015-12-02 2016-05-04 北京建筑大学 Method and system of virtual imaging of broken cultural relics
CN106524920A (en) * 2016-10-25 2017-03-22 上海建科工程咨询有限公司 Application of field measurement in construction project based on three-dimensional laser scanning
CN106650700A (en) * 2016-12-30 2017-05-10 上海联影医疗科技有限公司 Motif, and method and device for measuring system matrix
US10600230B2 (en) * 2018-08-10 2020-03-24 Sheng-Yen Lin Mesh rendering system, mesh rendering method and non-transitory computer readable medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110102456A1 (en) * 2009-10-30 2011-05-05 Synopsys, Inc. Drawing an image with transparent regions on top of another image without using an alpha channel

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003021532A2 (en) 2001-09-06 2003-03-13 Koninklijke Philips Electronics N.V. Method and apparatus for segmentation of an object
US8108929B2 (en) * 2004-10-19 2012-01-31 Reflex Systems, LLC Method and system for detecting intrusive anomalous use of a software system using multiple detection algorithms
US7860301B2 (en) 2005-02-11 2010-12-28 Macdonald Dettwiler And Associates Inc. 3D imaging system
US7965868B2 (en) * 2006-07-20 2011-06-21 Lawrence Livermore National Security, Llc System and method for bullet tracking and shooter localization
US7768656B2 (en) 2007-08-28 2010-08-03 Artec Group, Inc. System and method for three-dimensional measurement of the shape of material objects
KR20090047172A (en) * 2007-11-07 2009-05-12 삼성디지털이미징 주식회사 Method for controlling digital camera for picture testing
US8255100B2 (en) * 2008-02-27 2012-08-28 The Boeing Company Data-driven anomaly detection to anticipate flight deck effects
DE102008021558A1 (en) * 2008-04-30 2009-11-12 Advanced Micro Devices, Inc., Sunnyvale Process and system for semiconductor process control and monitoring using PCA models of reduced size
WO2009140582A2 (en) * 2008-05-16 2009-11-19 Geodigm Corporation Method and apparatus for combining 3d dental scans with other 3d data sets
EP2297705B1 (en) * 2008-06-30 2012-08-15 Thomson Licensing Method for the real-time composition of a video
US8750446B2 (en) * 2008-08-01 2014-06-10 Broadcom Corporation OFDM frame synchronisation method and system
US8896607B1 (en) * 2009-05-29 2014-11-25 Two Pic Mc Llc Inverse kinematics for rigged deformable characters
WO2011014192A1 (en) * 2009-07-31 2011-02-03 Analogic Corporation Two-dimensional colored projection image from three-dimensional image data
GB0913930D0 (en) * 2009-08-07 2009-09-16 Ucl Business Plc Apparatus and method for registering two medical images
WO2012115862A2 (en) * 2011-02-22 2012-08-30 3M Innovative Properties Company Space carving in 3d data acquisition
EP2707834B1 (en) * 2011-05-13 2020-06-24 Vizrt Ag Silhouette-based pose estimation
US8724880B2 (en) * 2011-06-29 2014-05-13 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus and medical image processing apparatus
EP2780826B1 (en) * 2011-11-15 2020-08-12 Trimble Inc. Browser-based collaborative development of a 3d model
US20150153476A1 (en) * 2012-01-12 2015-06-04 Schlumberger Technology Corporation Method for constrained history matching coupled with optimization
US9208550B2 (en) 2012-08-15 2015-12-08 Fuji Xerox Co., Ltd. Smart document capture based on estimated scanned-image quality
DE102013203667A1 (en) * 2013-03-04 2014-09-04 Adidas Ag Interactive booth and method for determining a body shape
WO2015006791A1 (en) 2013-07-18 2015-01-22 A.Tron3D Gmbh Combining depth-maps from different acquisition methods
US20150070468A1 (en) 2013-09-10 2015-03-12 Faro Technologies, Inc. Use of a three-dimensional imager's point cloud data to set the scale for photogrammetry

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110102456A1 (en) * 2009-10-30 2011-05-05 Synopsys, Inc. Drawing an image with transparent regions on top of another image without using an alpha channel

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Chivate et al., "Extending surfaces for reverse engineering solid model generation", Department of Mechanical Engineering, Pennsylvania State University, University Park, PA 16802, USA. Received 7 December 1994; accepted 27 August 1996. *
Wang et al., "Impulse-Based Rendering Methods for Haptic Simulation of Bone-Burring", IEEE Transactions on Haptics, Vol. 5, No. 4, October-December 2012. *
Zhuang et al., "Simplifying Complex CAD Geometry with Conservative Bounding Contours", Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, New Mexico, April 1997. *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160071327A1 (en) * 2014-09-05 2016-03-10 Fu Tai Hua Industry (Shenzhen) Co., Ltd. System and method for simplifying a mesh point cloud
US9830686B2 (en) * 2014-09-05 2017-11-28 Fu Tai Hua Industry (Shenzhen) Co., Ltd. System and method for simplifying a mesh point cloud
US20160188159A1 (en) * 2014-12-30 2016-06-30 Dassault Systemes Selection of a viewpoint of a set of objects
US20160196659A1 (en) * 2015-01-05 2016-07-07 Qualcomm Incorporated 3d object segmentation
US9866815B2 (en) * 2015-01-05 2018-01-09 Qualcomm Incorporated 3D object segmentation
EP3188049A1 (en) * 2015-12-30 2017-07-05 Dassault Systèmes Density based graphical mapping
US10049479B2 (en) 2015-12-30 2018-08-14 Dassault Systemes Density based graphical mapping
US10127333B2 (en) 2015-12-30 2018-11-13 Dassault Systemes Embedded frequency based search and 3D graphical data processing
US10360438B2 (en) 2015-12-30 2019-07-23 Dassault Systemes 3D to 2D reimaging for search
US20190005709A1 (en) * 2017-06-30 2019-01-03 Apple Inc. Techniques for Correction of Visual Artifacts in Multi-View Images
US10754242B2 (en) 2017-06-30 2020-08-25 Apple Inc. Adaptive resolution and projection format in multi-direction video
US10762595B2 (en) * 2018-11-07 2020-09-01 Steelcase, Inc. Designated region projection printing of spatial pattern for 3D object on flat sheet in determined orientation

Also Published As

Publication number Publication date
US9841277B2 (en) 2017-12-12
US20150279075A1 (en) 2015-10-01
US20150279121A1 (en) 2015-10-01
US20150278155A1 (en) 2015-10-01
US20150276392A1 (en) 2015-10-01

Similar Documents

Publication Publication Date Title
US10515480B1 (en) Automated three dimensional model generation
KR101993920B1 (en) Method and apparatus for representing physical scene
US9734634B1 (en) Augmented reality product preview
JP6083747B2 (en) Position and orientation detection system
US9776364B2 (en) Method for instructing a 3D printing system comprising a 3D printer and 3D printing system
US10529141B2 (en) Capturing and aligning three-dimensional scenes
Moons et al. 3D Reconstruction from Multiple Images: Principles
US9208547B2 (en) Stereo correspondence smoothness tool
JP6015032B2 (en) Provision of location information in a collaborative environment
KR101636027B1 (en) Methods and systems for capturing and moving 3d models and true-scale metadata of real world objects
CN104330022B (en) Method and system is determined using the volume of structure from motion algorithm
JP6635690B2 (en) Information processing apparatus, information processing method and program
AU2014203440B2 (en) Information processing device, position designation method
CN102812416B (en) Pointing input device, indicative input method, program, recording medium and integrated circuit
TW201709718A (en) Method and apparatus for displaying a light field based image on a user's device, and corresponding computer program product
US9953112B2 (en) Method and system for displaying room interiors on a floor plan
Bernardini et al. The 3D model acquisition pipeline
Pears et al. 3D imaging, analysis and applications
US10127199B2 (en) Automatic measure of visual similarity between fonts
US20130004060A1 (en) Capturing and aligning multiple 3-dimensional scenes
KR101841668B1 (en) Apparatus and method for producing 3D model
CN107111833A (en) Quick 3D model adaptations and anthropological measuring
CN105074617B (en) Three-dimensional user interface device and three-dimensional manipulating processing method
JP5991423B2 (en) Display device, display method, display program, and position setting system
US6954212B2 (en) Three-dimensional computer modelling

Legal Events

Date Code Title Description
AS Assignment

Owner name: KNOCKOUT CONCEPTS, LLC, OHIO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MEYERS, STEPHEN B;REEL/FRAME:035776/0218

Effective date: 20150528

Owner name: KNOCKOUT CONCEPTS, LLC, OHIO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUTTOTHARA, JACOB A;WATHEN, JOHN M;PADDOCK, STEVEN D;AND OTHERS;REEL/FRAME:035776/0299

Effective date: 20150528

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION