CA2577205A1 - Design software incorporating efficient 3-d rendering - Google Patents
- Publication number
- CA2577205A1
- Authority
- CA
- Canada
- Prior art keywords
- design
- user input
- recited
- blanks
- processing unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/10—Geometric CAD
- G06F30/13—Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2111/00—Details relating to CAD techniques
- G06F2111/20—Configuration CAD, e.g. designing by assembling or positioning modules selected from libraries of predesigned modules
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Geometry (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Theoretical Computer Science (AREA)
- Mathematical Optimization (AREA)
- Computational Mathematics (AREA)
- Architecture (AREA)
- Mathematical Analysis (AREA)
- Structural Engineering (AREA)
- Pure & Applied Mathematics (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Civil Engineering (AREA)
- Stored Programmes (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Design software in accordance with an implementation of the present invention is configured to provide believable three-dimensional representations of user selections in real-time.
Design elements that would otherwise be difficult to efficiently render three-dimensionally in real-time are prerendered for realistic visual effects, such as realistic shading, which correspond to various positions of the elements in a design space. Blanks of the visual effects for each position are then stored in a data store for visual effects. At run time, data associated with user design choices, as well as the blanks for any corresponding design elements, are fed in one implementation to peripheral processing hardware, such as a GPU, which sends the processed data to a display device. The user is therefore able to view complex visual data of certain design choices efficiently with added realism.
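The pipeline the abstract describes can be sketched end-to-end in a few lines. This is a minimal illustration under stated assumptions: the data store of blanks is modeled as a plain dict, and `gpu_process` is a stand-in for the peripheral processing hardware; all names here are hypothetical, not the patent's actual implementation.

```python
# Offline step: prerender a shading "blank" for each position of the element.
blank_store = {
    ("desk", pos): f"shading_blank_{pos}" for pos in ("front", "side", "top")
}

def gpu_process(blank, attributes):
    # Stand-in for peripheral processing hardware (a GPU in the text):
    # combines the prerendered effect with the user's design choices.
    return {"blank": blank, **attributes}

def render_choice(element, position, attributes):
    """Runtime step: fetch the prerendered blank and feed it, together
    with the user's design choices, to the GPU stand-in."""
    blank = blank_store[(element, position)]
    return gpu_process(blank, attributes)

frame = render_choice("desk", "side", {"color": "oak", "size": "large"})
print(frame["blank"])  # shading_blank_side
```

The expensive rendering happens once per position, offline; the runtime path is only a lookup plus a cheap combination step, which is what makes real-time display feasible.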
Claims (21)
1. In a computerized environment in which a design program is loaded in memory, and processed at a central processing unit, a method of efficiently rendering realistic three-dimensional views of a user's design choices during run-time comprising:
an act of receiving user input regarding the positioning of a design element in a design space, the user input including one or more attributes associated with the design element;
an act of retrieving a blank for the design element from a data store;
and a step for providing an accurate three-dimensional view of the user input at a display device through communication with a graphical processing unit, such that the graphical processing unit processes and provides to the display device accurate visual effect data for the design element.
2. The method as recited in claim 1, wherein the step for providing an accurate three-dimensional view of the user input comprises the acts of:
creating a combined data stream that includes one or more of the blanks for the design element and any of the one or more attributes;
processing the combined data stream at the graphical processing unit;
and passing the processed combined data stream to the display device upon selection of a three-dimensional view.
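The three acts of claim 2 can be sketched as follows. This is an illustrative model only: the "graphical processing engine" is reduced to a function that merges blanks with attributes into one stream, and the GPU and display device are stand-in functions; the names are hypothetical.

```python
def create_combined_stream(blanks, attributes):
    """Act 1: merge the prerendered blanks with the user's attributes
    into a single combined data stream."""
    return [{"blank": b, **attributes} for b in blanks]

def gpu_process(stream):
    """Act 2: stand-in for processing the combined stream at the
    graphical processing unit."""
    return [f"{item['blank']}+{item['color']}" for item in stream]

def display(frames):
    """Act 3: pass the processed stream to the display device."""
    return list(frames)

stream = create_combined_stream(["shadow_blank"], {"color": "oak"})
shown = display(gpu_process(stream))
print(shown)  # ['shadow_blank+oak']
```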
3. The method as recited in claim 2, wherein a graphical processing engine creates the combined data stream and passes the combined data stream to the graphical processing unit.
4. The method as recited in claim 2, wherein the user input is received initially via a central processing unit that is processing instructions associated with the design software and one or more peripheral devices for receiving the user input.
5. The method as recited in claim 4, wherein the act of receiving user input comprises an act of receiving data associated with the user input at the graphical processing engine after the user input has been received initially via a central processing unit.
6. The method as recited in claim 2, wherein the combined data stream further comprises object data, the object data being based at least in part on the one or more attributes.
7. The method as recited in claim 2, further comprising an act of receiving user input at a graphical processing engine via the design software, the user input indicating a preference for a three-dimensional view.
8. The method as recited in claim 7, wherein the blank comprises data about a corresponding visual effect that has been separated from any of size, material, or color data for the design element.
9. The method as recited in claim 8, further comprising:
receiving additional user input regarding a change in position of the design element in a three-dimensional view; and retrieving from the data store a different blank corresponding to the change in position of the design element.
10. In a computerized environment in which a software program receives input from a user regarding design choices for a space, a method of prerendering one or more visual effects for one or more selectable elements, such that the user's selections for the given elements can be rendered in a believable representation for display in real-time, comprising the acts of:
identifying one or more positions of an element to be placed in a space;
rendering a visual effect for each of the one or more positions;
creating one or more blanks corresponding to each of the one or more positions, the one or more blanks containing data about a corresponding visual effect for the element; and passing the created one or more blanks to a data store, such that the one or more blanks can later be accessed by a graphical processing unit in response to user input for the element.
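The prerendering acts of claim 10 map onto a simple loop. The sketch below assumes an in-memory dict as the data store and a string as the rendered effect; `render_effect` stands in for the expensive offline render, and all names are hypothetical.

```python
def render_effect(element, pos):
    # Stand-in for an expensive offline render of a visual effect
    # (e.g. realistic shading) for one position of the element.
    return f"{element}-effect@{pos}"

def prerender_element(element, positions, data_store):
    """For each identified position: render a visual effect, create a
    blank holding that effect data, and pass the blank to the store."""
    for pos in positions:                        # act: identify positions
        effect = render_effect(element, pos)     # act: render visual effect
        blank = {"element": element, "position": pos, "effect": effect}
        data_store[(element, pos)] = blank       # act: pass blank to store

store = {}
prerender_element("lamp", [(0, 0), (1, 0)], store)
print(store[("lamp", (1, 0))]["effect"])  # lamp-effect@(1, 0)
```

Each blank keys on (element, position), so a graphical processing unit can later fetch exactly the effect matching the user's chosen view.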
11. The method as recited in claim 10, wherein the one or more created blanks comprise data about a corresponding visual effect that has been separated from any of position, size, material, or color data for the element.
12. The method as recited in claim 10, wherein a central processing unit processes instructions associated with rendering the visual effect and with creating the one or more blanks.
13. The method as recited in claim 10, further comprising an act of passing to a graphical processing engine the element, one or more positions of the element in the space, and corresponding one or more blanks associated with the element.
14. The method as recited in claim 10, further comprising:
receiving a first user input indicating a preference for a first three-dimensional view of the element; and passing a first of the one or more blanks to a graphical processing engine.
15. The method as recited in claim 14, further comprising:
receiving a second user input indicating a preference for a second three-dimensional view of the element that shows the element from a different position compared to the first three-dimensional view; and passing a second of the one or more blanks to the graphical processing engine.
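Claims 14 and 15 describe successive view selections each pulling a different prerendered blank. A minimal sketch, assuming the store is a dict keyed by view position and the "graphical processing engine" receives blanks through a queue (both hypothetical):

```python
# Prerendered effect blanks, one per view position of the element.
blanks = {
    "front": "front_shading_blank",
    "side": "side_shading_blank",
}

def on_view_selected(position, engine_queue):
    """On each user input, pass the blank for the requested view
    position to the graphical processing engine."""
    engine_queue.append(blanks[position])

queue = []
on_view_selected("front", queue)  # first user input: first view
on_view_selected("side", queue)   # second user input: different position
print(queue)  # ['front_shading_blank', 'side_shading_blank']
```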
16. The method as recited in claim 10, wherein the element is a design element such as a desk, table, or lamp, and wherein the space is a design space such as an interior or exterior space for use in an architectural design application.
17. In a computerized environment in which a design program receives input from a user regarding design choices for an interior or exterior space, a computerized design system configured to render design elements in a design space in a believable and efficient manner comprising:
a first processor configured to process one or more of user input identifying one or more positions of a design element in a design space, and an identity of one or more visual effect blanks corresponding to the one or more positions; and a second processor configured to generate pixel information for one or more positions of the design element in the design space, for the corresponding one or more blanks, and for any of color, size, or material information for the design element.
18. The computerized design system as recited in claim 17, further comprising:
one or more input devices; and a display device communicatively coupled to graphical hardware on which the second processor is mounted.
19. The computerized design system as recited in claim 17, wherein the first processor is a central processing unit and the second processor is a graphical processing unit.
20. The computerized design system as recited in claim 17, wherein the first processor receives instructions directly from the design software loaded into main memory, and wherein the second processor receives instructions indirectly from the design software via a graphical processing engine.
21. In a computerized system in a computerized environment in which a design program is loaded in memory and processed at a central processing unit, a computer program product having computer-executable instructions stored thereon that, when executed, cause one or more components of the computerized system to perform a method of realistically and efficiently rendering three-dimensional views of a user's design choices during run-time comprising:
an act of receiving user input regarding the positioning of a design element in a design space, the user input including one or more attributes associated with the design element;
an act of retrieving a blank for the design element from a data store;
and a step for providing an accurate three-dimensional view of the user input at a display device through communication with a graphical processing unit, such that the graphical processing unit processes and provides to the display device accurate visual effect data for the design element.
Applications Claiming Priority (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US60223304P | 2004-08-17 | 2004-08-17 | |
US60/602,233 | 2004-08-17 | ||
US11/204,420 | 2005-08-16 | ||
US11/204,419 US7249005B2 (en) | 2004-08-17 | 2005-08-16 | Design software incorporating efficient 3-D rendering |
US11/204,421 | 2005-08-16 | ||
US11/204,421 US8751950B2 (en) | 2004-08-17 | 2005-08-16 | Capturing a user's intent in design software |
US11/204,420 US7277830B2 (en) | 2004-08-17 | 2005-08-16 | Capturing a user's design intent with resolvable objects |
US11/204,419 | 2005-08-16 | ||
PCT/IB2005/003434 WO2006018744A2 (en) | 2004-08-17 | 2005-08-17 | Design software incorporating efficient 3-d rendering |
Publications (2)
Publication Number | Publication Date |
---|---|
CA2577205A1 true CA2577205A1 (en) | 2006-02-23 |
CA2577205C CA2577205C (en) | 2012-10-23 |
Family
ID=35907784
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA2577199A Active CA2577199C (en) | 2004-08-17 | 2005-08-17 | Capturing a user's intent in design software |
CA2577202A Active CA2577202C (en) | 2004-08-17 | 2005-08-17 | Capturing a user's design intent with resolvable objects |
CA2577205A Active CA2577205C (en) | 2004-08-17 | 2005-08-17 | Design software incorporating efficient 3-d rendering |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA2577199A Active CA2577199C (en) | 2004-08-17 | 2005-08-17 | Capturing a user's intent in design software |
CA2577202A Active CA2577202C (en) | 2004-08-17 | 2005-08-17 | Capturing a user's design intent with resolvable objects |
Country Status (2)
Country | Link |
---|---|
CA (3) | CA2577199C (en) |
WO (3) | WO2006018740A2 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8751950B2 (en) | 2004-08-17 | 2014-06-10 | Ice Edge Business Solutions Ltd. | Capturing a user's intent in design software |
US20050071135A1 (en) | 2003-09-30 | 2005-03-31 | Vredenburgh David W. | Knowledge management system for computer-aided design modeling |
US8510672B2 (en) | 2004-08-17 | 2013-08-13 | Dirtt Environmental Solutions Ltd | Automatically creating and modifying furniture layouts in design software |
WO2009111885A1 (en) | 2008-03-11 | 2009-09-17 | Dirtt Environmental Solutions, Ltd. | Automatically creating and modifying furniture layouts in design software |
US7908296B2 (en) | 2006-02-16 | 2011-03-15 | Dirtt Environmental Solutions, Ltd. | Integrating object-oriented design software with record-based CAD software |
US8762941B2 (en) | 2006-02-16 | 2014-06-24 | Dirtt Environmental Solutions, Ltd. | Rendering and modifying CAD design entities in object-oriented applications |
CA2781638C (en) | 2009-11-24 | 2019-06-04 | Ice Edge Business Solutions Inc. | Securely sharing design renderings over a network |
WO2012173741A2 (en) | 2011-06-11 | 2012-12-20 | Dirtt Environmental Solutions Inc. | Automated re-use of structural components |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5557537A (en) * | 1990-07-12 | 1996-09-17 | Normann; Linda M. | Method and apparatus for designing and editing a distribution system for a building |
US5293479A (en) * | 1991-07-08 | 1994-03-08 | Quintero Smith Incorporated | Design tool and method for preparing parametric assemblies |
US6888542B1 (en) * | 1999-01-27 | 2005-05-03 | Autodesk, Inc. | Error recovery in a computer aided design environment |
KR100488113B1 (en) * | 1999-08-03 | 2005-05-06 | 켄이치 니노미야 | Design support system, design support method, and medium storing design support program |
EP1098244A3 (en) * | 1999-11-02 | 2001-06-13 | CANAL + Société Anonyme | Graphical user interface |
US7505044B2 (en) * | 2000-07-31 | 2009-03-17 | Bowsher M William | Universal ultra-high definition color, light, and object rendering, advising, and coordinating system |
US7523411B2 (en) * | 2000-08-22 | 2009-04-21 | Bruce Carlin | Network-linked interactive three-dimensional composition and display of saleable objects in situ in viewer-selected scenes for purposes of object promotion and procurement, and generation of object advertisements |
US6988091B2 (en) * | 2001-04-02 | 2006-01-17 | Richard Levine | Method and system for coordination of CAD drawings providing collision detection between drawings |
US7130775B2 (en) * | 2001-11-07 | 2006-10-31 | Kajima Corporation | Integrated building production information system |
US7277572B2 (en) * | 2003-10-10 | 2007-10-02 | Macpearl Design Llc | Three-dimensional interior design system |
2005
- 2005-08-17 WO PCT/IB2005/003248 patent/WO2006018740A2/en active Application Filing
- 2005-08-17 CA CA2577199A patent/CA2577199C/en active Active
- 2005-08-17 CA CA2577202A patent/CA2577202C/en active Active
- 2005-08-17 CA CA2577205A patent/CA2577205C/en active Active
- 2005-08-17 WO PCT/IB2005/003298 patent/WO2006018742A2/en active Application Filing
- 2005-08-17 WO PCT/IB2005/003434 patent/WO2006018744A2/en not_active Application Discontinuation
Also Published As
Publication number | Publication date |
---|---|
WO2006018742A2 (en) | 2006-02-23 |
WO2006018744A2 (en) | 2006-02-23 |
WO2006018742A3 (en) | 2006-09-08 |
CA2577205C (en) | 2012-10-23 |
CA2577202C (en) | 2011-11-08 |
CA2577199C (en) | 2015-12-22 |
WO2006018740A3 (en) | 2006-09-08 |
WO2006018744A3 (en) | 2006-06-29 |
CA2577202A1 (en) | 2006-02-23 |
WO2006018740A2 (en) | 2006-02-23 |
CA2577199A1 (en) | 2006-02-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2577205A1 (en) | Design software incorporating efficient 3-d rendering | |
CN109448099B (en) | Picture rendering method and device, storage medium and electronic device | |
US20180033210A1 (en) | Interactive display system with screen cut-to-shape of displayed object for realistic visualization and user interaction | |
US11954805B2 (en) | Occlusion of virtual objects in augmented reality by physical objects | |
EP3090424A1 (en) | Assigning virtual user interface to physical object | |
WO2015140815A1 (en) | Real-time customization of a 3d model representing a real product | |
Montero et al. | Designing and implementing interactive and realistic augmented reality experiences | |
US20170169613A1 (en) | Displaying an object with modified render parameters | |
EP1624367A3 (en) | Three-dimentional motion graphic user interface and method and apparatus for providing the same | |
CN106683193B (en) | Design method and design device of three-dimensional model | |
US20150213632A1 (en) | Rendering an image on a display screen | |
US20160350981A1 (en) | Image processing method and device | |
US11195323B2 (en) | Managing multi-modal rendering of application content | |
EP1922700B1 (en) | 2d/3d combined rendering | |
Pietriga et al. | Representation-independent in-place magnification with sigma lenses | |
US9043707B2 (en) | Configurable viewcube controller | |
WO2016200571A1 (en) | Adjusted location hologram display | |
US20160353085A1 (en) | Video processing method and device | |
CN108549479A (en) | The realization method and system of multichannel virtual reality, electronic equipment | |
CN111260768B (en) | Picture processing method and device, storage medium and electronic device | |
JP2005071332A5 (en) | ||
CN104598182B (en) | Three-dimensional demonstration method and device for document | |
JP2012155731A (en) | Retrieval system | |
CN109697001A (en) | The display methods and device of interactive interface, storage medium, electronic device | |
JP2007108834A (en) | Advertisement display simulation device, advertisement display simulation program, and advertisement display simulation method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
EEER | Examination request |