US20160371882A1 - Method and system for displaying and navigating an optimal multi-dimensional building model - Google Patents
- Publication number
- US20160371882A1 (U.S. application Ser. No. 15/255,952)
- Authority
- US
- United States
- Prior art keywords
- building model
- dimensional building
- view
- optimal
- dimensional
- Prior art date
- Legal status
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/176—Urban or other man-made structures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
- G06V20/653—Three-dimensional objects by matching three-dimensional models, e.g. conformal mapping of Riemann surfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/08—Indexing scheme for image data processing or generation, in general involving all processing steps from image acquisition to 3D model generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/04—Architectural design, interior design
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/56—Particle system, point based geometry or rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2008—Assembling, disassembling
Definitions
- The technology described herein relates generally to an interactive three-dimensional (3D) map system and, in particular, to a system and method for visualizing 3D building models within an interactive 3D building model visualization system.
- Location-based and mobile technologies are often considered the center of the technology revolution of this century. Essential to these are ways to best present location-based information to electronic devices, particularly mobile devices.
- The technology used to represent this information has traditionally been based on a two-dimensional (2D) map.
- FIG. 1 illustrates one embodiment of a system for generating an optimal view of a 3D building model in accordance with the present disclosure
- FIG. 2 illustrates a diagrammatic representation of a machine in the example form of a computer system in accordance with the present disclosure
- FIGS. 3A and 3B illustrate example embodiments of building models from a three-dimensional building model visualization system in accordance with the present disclosure
- FIG. 4 illustrates an example embodiment of calculating a camera position for a 3D building model in accordance with the present disclosure
- FIG. 5 illustrates a flowchart embodiment of a process for generating optimal views of selected building models in accordance with the present disclosure
- FIG. 6 illustrates a flowchart of an exemplary embodiment for calculating a camera position in accordance with the present disclosure
- FIGS. 7A-7B collectively illustrate examples of optimal views for 3D building models in a 3D building model visualization system in accordance with the present disclosure
- FIG. 8 illustrates a general schematic of a multiple view building model in accordance with the present disclosure.
- FIG. 9 illustrates an embodiment for displaying and navigating a 3D building model visualization system in accordance with the present disclosure.
- One or more embodiments of the technology described herein include a method and system for visualizing and navigating 3D building models. Optimal views of the facades (sides) of 3D building models are automatically generated for display. In another embodiment, the system and method provides for navigation through a 3D map system using one or more optimal views of facades.
- FIG. 1 illustrates one embodiment of a system for generating an optimal view of a 3D building model in accordance with the present disclosure.
- 3D building model visualization system 100 includes view processing system 102 and electronic device 104 coupled via a network channel 106 .
- Network channel 106 is a system for communication.
- Network channel 106 includes, for example, an Ethernet or other wire-based network or a wireless network or wireless adapter for communicating with a wireless network, such as a WI-FI network.
- network channel 106 includes any suitable network for any suitable communication interface.
- network channel 106 can include an ad hoc network, a personal area network (PAN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), or one or more portions of the Internet, cloud based systems or a combination of two or more of these. One or more portions of one or more of these networks may be wired or wireless.
- network channel 106 can be a wireless PAN (WPAN) (such as, for example, a BLUETOOTH WPAN), a WI-FI network, a WI-MAX network, a 3G or 4G network, a cellular telephone network (such as, for example, a Global System for Mobile Communications (GSM) network).
- network channel 106 uses standard communications technologies and/or protocols.
- network channel 106 can include links using technologies such as Ethernet, 802.11, worldwide interoperability for microwave access (WiMAX), 3G, 4G, LTE, CDMA, digital subscriber line (DSL), etc.
- the networking protocols used on network channel 106 can include multiprotocol label switching (MPLS), the transmission control protocol/Internet protocol (TCP/IP), the User Datagram Protocol (UDP), the hypertext transport protocol (HTTP), the simple mail transfer protocol (SMTP), and the file transfer protocol (FTP).
- the data exchanged over network channel 106 is represented using technologies and/or formats including the hypertext markup language (HTML) and the extensible markup language (XML).
- all or some of links can be encrypted using conventional encryption technologies such as secure sockets layer (SSL), transport layer security (TLS), and Internet Protocol security (IPsec).
- Electronic device 104 is a device for selection and display of a 3D building model.
- electronic device 104 is a computer with a monitor, a laptop, a touch screen display, a smartphone, tablet computer, an LED array, game console, a television set, a projector display, a wearable heads-up display of some sort, or any combination thereof.
- electronic device 104 includes a computer system, such as computer system 200 of FIG. 2 , for processing 3D visual data for display.
- view processing system 102 collects building selection information (e.g., address, geo-location, position or areas within a displayed map, etc.) corresponding to a selected 3D building model from electronic device 104 .
- electronic device 104 directly uploads selection information to view processing system 102 via the network channel 106 , or indirectly uploads the information.
- the information is uploaded to a computer or a server first before being uploaded to view processing system 102 .
- the information is transferred from electronic device 104 to a networked computer, and then the information is transferred to the view processing system 102 .
- FIG. 2 illustrates a diagrammatic representation of a computer system in accordance with the present disclosure.
- computer system 200 includes a machine and a set of instructions for causing the machine to perform any one or more of the methodologies or modules discussed herein.
- Computer system 200 includes a processor, memory, non-volatile memory, and an interface device.
- Various common components (e.g., cache memory) are omitted for illustrative simplicity.
- the computer system 200 is intended to illustrate a hardware device on which any of the components depicted in the example of FIGS. 1-2 (and any other components described in this specification) can be implemented.
- Computer system 200 can be of any applicable known or convenient type.
- the components of computer system 200 can be coupled together via a bus or through some other known or convenient device.
- computer system 200 may be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC) (such as, for example, a computer-on-module (COM) or system-on-module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a personal digital assistant (PDA), a server, or a combination of two or more of these.
- computer system 200 may include one or more computer systems; be unitary or distributed; span multiple locations; span multiple machines; or reside in a cloud, which may include one or more cloud components in one or more networks.
- one or more computer systems 200 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein.
- one or more computer systems 200 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein.
- One or more computer systems 200 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
- the processor may be, for example, a conventional microprocessor such as an Intel Pentium microprocessor or Motorola power PC microprocessor.
- machine-readable (storage) medium or “computer-readable (storage) medium” include any type of device that is accessible by the processor.
- the memory is coupled to the processor by, for example, a bus.
- the memory can include, by way of example but not limitation, random access memory (RAM), such as dynamic RAM (DRAM) and static RAM (SRAM).
- the memory can be local, remote, or distributed.
- the bus also couples the processor to the non-volatile memory and drive unit.
- the non-volatile memory is often a magnetic floppy or hard disk, a magnetic-optical disk, an optical disk, a read-only memory (ROM), such as a CD-ROM, EPROM, or EEPROM, a magnetic or optical card, or another form of storage for large amounts of data. Some of this data is often written, by a direct memory access process, into memory during execution of software in computer system 200 .
- the non-volatile storage can be local, remote, or distributed.
- the non-volatile memory is optional because systems can be created with all applicable data available in memory.
- a typical computer system will usually include at least a processor, memory, and a device (e.g., a bus) coupling the memory to the processor.
- Software is typically stored in the non-volatile memory and/or the drive unit. Indeed, for large programs, it may not even be possible to store the entire program in the memory. Nevertheless, it should be understood that for software to run, if necessary, it is moved to a computer readable location appropriate for processing, and for illustrative purposes, that location is referred to as the memory in this paper. Even when software is moved to the memory for execution, the processor will typically make use of hardware registers to store values associated with the software, and local cache that, ideally, serves to speed up execution.
- a software program is assumed to be stored at any known or convenient location (from non-volatile storage to hardware registers) when the software program is referred to as “implemented in a computer-readable medium.”
- a processor is considered to be “configured to execute a program” when at least one value associated with the program is stored in a register readable by the processor.
- the bus also couples the processor to the network interface device.
- the interface can include one or more of a modem or network interface. It will be appreciated that a modem or network interface can be considered to be part of computer system 200 .
- the interface can include an analog modem, ISDN modem, cable modem, token ring interface, satellite transmission interface (e.g., “direct PC”), or other interfaces for coupling a computer system to other computer systems.
- the interface can include one or more input and/or output devices.
- the I/O devices can include, by way of example but not limitation, a keyboard, a mouse or other pointing device, disk drives, printers, a scanner, and other input and/or output devices, including a display device.
- the display device can include, by way of example but not limitation, a cathode ray tube (CRT), liquid crystal display (LCD), or some other applicable known or convenient display device.
- controllers of any devices not depicted reside in the interface.
- computer system 200 can be controlled by operating system software that includes a file management system, such as a disk operating system.
- One example of operating system software with associated file management system software is the family of operating systems known as Windows® from Microsoft Corporation of Redmond, Wash., and their associated file management systems. Another example is the Linux™ operating system and its associated file management system.
- the file management system is typically stored in the non-volatile memory and/or drive unit and causes the processor to execute the various acts required by the operating system to input and output data and to store data in the memory, including storing files on the non-volatile memory and/or drive unit.
- an “optimal view” of a 3D building model facade is defined as a preferred view of a 3D building model (e.g., an elevated front view) that is generated by a 3D building model visualization system from a bird's-eye perspective and that provides an unobstructed or partially unobstructed view of a 3D building model or a group of connected or close-proximity (e.g., same city block) models.
- the generated view is provided to a computer system display as a centered image including at least one facade (side) of the 3D building model.
- the 3D building model refers to a textured 3D building model using known texturing techniques.
- the generated optimal view of the 3D building model includes visible buffer regions around the centered view of the building (top, bottom, left, right). In one or more embodiments, these buffer regions are white space around a selected 3D building object. In other embodiments, the buffer regions include elements of building models and/or other visible structures located in the vicinity of the selected building model(s) within the 3D map.
- FIGS. 3A and 3B illustrate embodiments of 3D building models from the 3D building model visualization system in accordance with the present disclosure.
- FIG. 3A illustrates one embodiment of a building model where 3D building model 300 is selected to be displayed on electronic device 104 of FIG. 1 .
- the 3D building model is selected on the electronic device using current global positioning system (GPS) information.
- building model 300 is selected by touching or clicking (e.g., using a mouse/stylus) on a location of a map on the display of electronic device 104 of FIG. 1 or by typing in an address of the desired 3D building model.
- electronic device 104 displays a view of building model 300 in the form of a view of building model facade 304 .
- First main axis 301 and second main axis 302 are calculated axes (e.g., following a primary roof line, for example a lengthwise building peak or orthogonal to the other axis) which define the orientation of the building model.
- the disclosed 3D building model visualization system in accordance with the present disclosure, automatically calculates and/or pre-stores previously calculated optimal views for each facade of the selected or stored 3D building models.
- an optimal view for building model facade 304 is calculated based on a defined look angle 303 with respect to calculated main axis 301 and a defined field of view (e.g., defined up/down, left/right angles which define the extent of what is viewed on a 3D map) for the selected building.
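As a rough sketch of this look-angle geometry, the forward look direction can be derived from the main-axis heading and the downward look angle. The following is an illustrative reconstruction under assumed conventions (z-up world, main axis given as an azimuth angle); the function name and example angles are hypothetical, not the patent's implementation:

```python
import math

def look_direction(axis_azimuth_deg, look_angle_deg):
    """Unit forward vector for a camera viewing a facade along the building's
    main axis, pitched downward by the look angle (bird's-eye perspective)."""
    az = math.radians(axis_azimuth_deg)    # heading of the main axis in the ground plane
    pitch = math.radians(look_angle_deg)   # downward pitch, e.g. ~20 degrees
    # The horizontal component shrinks by cos(pitch); the vertical component points down.
    return (math.cos(pitch) * math.cos(az),
            math.cos(pitch) * math.sin(az),
            -math.sin(pitch))

# Example: looking along the +Y main axis with a 20-degree downward look angle.
forward = look_direction(90.0, 20.0)
```

The resulting unit vector is the camera forward direction used below when positioning the camera for an optimal view.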
- the calculated optimal view is centered on the display screen.
- a buffer zone (e.g., white space or elements of surrounding model structures) is included around the centered optimal view.
- a primary optimal view is the optimal view which the 3D building model visualization system ranks as the highest quality optimal view available for display.
- the primary optimal view is a view generated for the front facade of a 3D building model.
- a primary axis is a first main axis perpendicular to the side of the building model which is displayed in the calculated primary optimal view.
- selection of a building model on a 3D map will result in the 3D building model visualization system calculating optimal views for all the sides of a building model and displaying only the primary optimal view of the building model.
- the 3D building model visualization system will provide interface controls to select one or more of the calculated optimal views of the building model.
- the building model facade 304 is facing a main road or street in the 3D building model visualization system and the primary optimal view (typically the front) is selected to be one of the optimal views which face a main street or road path 305 from the 3D building model visualization system.
- FIG. 3B illustrates another embodiment of a building model from the 3D building model visualization system in accordance with the present disclosure.
- building model 300 has a primary optimal view selected on side 307 of the 3D building model which is not facing a road. The optimal view is calculated and displayed based on a defined look angle 306 and second main axis 302 .
- machine learning capable algorithms are used to identify and determine a selected 3D building model's architectural features (e.g., windows, bay windows, doors, columns, railings, etc.) and rank the various facades of a selected 3D building model based on expected architectural features typically located on primary facades (e.g., centered front door, portico, porch, high number of windows, etc.).
- known neural networks or statistics such as archetypal analysis on buildings or imagery are used to determine a pattern as to what feature combinations and layouts generally constitute the primary optimal view of a building model within the 3D building model visualization system.
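As a toy illustration of feature-based facade ranking: the feature names and weights below are invented for the example; the text contemplates learned models (neural networks, archetypal analysis) rather than fixed weights like these.

```python
# Illustrative feature weights (assumptions, not from the patent): features
# expected on a primary facade, such as a front door or porch, score highest.
FEATURE_WEIGHTS = {"front_door": 5.0, "portico": 3.0, "porch": 2.0, "window": 0.5}

def rank_facades(facades):
    """facades: dict mapping facade name -> dict of detected feature counts.
    Returns facade names ordered from most to least likely primary facade."""
    def score(features):
        return sum(FEATURE_WEIGHTS.get(f, 0.0) * n for f, n in features.items())
    return sorted(facades, key=lambda name: score(facades[name]), reverse=True)

ranked = rank_facades({
    "north": {"window": 4},
    "south": {"front_door": 1, "porch": 1, "window": 6},
})
# The south facade ranks first: a centered front door and porch outweigh windows alone.
```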
- optimal views and navigation within the 3D building model visualization system are achieved through pre-calculated, stored building model views and interbuilding navigation routes (e.g. roads, intersections, etc.).
- stored calculated optimal views of the building models are displayed.
- the 3D mapping system routes and views are then navigated in an object-oriented, spatially intuitive (e.g., up, down, left, right) manner.
- the system simplifies industry-common 3D navigational approaches, typically referred to as “six degrees of freedom” methods, which require fine motor adjustments that are unnecessary when navigating a 3D mapping system.
- the navigator can navigate in 3D by sequencing through optimal views of one or more building models rather than trying to navigate in 3 dimensions using, for example, a 3D mouse or joystick.
- FIG. 4 illustrates an example embodiment of calculating a camera position for a 3D building model in accordance with the present disclosure.
- camera position 401 is arranged for generating optimal views for one or multiple building models 400 in a 3D building model visualization system.
- a projection pyramid having forward direction vector 402 , right vector 403 and up vector 404 projecting from the camera point is used to calculate a camera position. Additional details for calculating a camera position are discussed in FIG. 6 .
- FIG. 5 illustrates an embodiment of a flowchart of the process for generating optimal views of selected 3D building models in accordance with the present disclosure.
- Process 500 is initiated when one or more 3D building models are retrieved from the 3D building model visualization system (step 501 ) based on a selected building location.
- the location is selected using current global positioning system (GPS) information (e.g., a user is standing in front of a building of interest), by touching or clicking on a location of a map, or typing in an address of the desired 3D building model.
- the 3D building models for the selected location are retrieved from the 3D map system.
- an optimal look angle (downward pitch angle) is defined (step 502 ) for the selected 3D building model.
- an optimal look angle as depicted in FIGS. 3A-3B ( 303 / 306 ) is set as a default downward pitch angle relative to the calculated axis by the 3D building model visualization system (e.g., 10-50 degrees).
- an optimal look angle is selected at or proximate to 20 degrees.
- the optimal look angle is defined to limit obstructions from the view of the 3D building model facades. For example, if a tall structure is in the line of sight of the optimal view, the look angle can be raised to look over the obstruction.
- a field of view (extent of the observable view) is set in the next step ( 503 ) once the optimal look angle for the selected 3D building model is defined.
- the field of view represents the selected 3D building model including its surrounding area.
- the field of view may include several feet of surroundings in the front of the 3D building model, the sides of the 3D building model or above the 3D building model.
- the field of view is defined manually by selection of a boundary for a selected 3D building object.
- the 3D building model visualization system defines the field of view automatically, using computer vision algorithms to account for road networks in the vicinity of the selected 3D building model, obstructions from other buildings or structures, relevant faces of the building models, or any other structures or map elements which suggests an initial optimal field of view.
- the 3D building model visualization system provides a default field of view which includes fixed up/down angles (typically 45 degrees) and/or left/right angles.
- the first (primary) and second (perpendicular) axes of the selected building models are calculated by the system (step 504 ) in the next step.
- angular thresholds are used to establish groups of like angles and identify the edges of the structure (e.g., 3D building model) in order to determine four sides as well as the orientation of a first optimal view.
- a first edge becomes an initial grouping plane location.
- a weighted edge length is generated that represents a new edge length raised to the power of the influence value.
- the influence value is 3.
- a main axis is calculated as the average of all of the edge orientations within the angular threshold, regardless of their length. The larger the influence value, the more that longer edges, and the less that shorter edges, influence the estimate of the axis going through the edges.
- a weighted edge plane is generated for each edge that is the distance of the new edge from the initial edge plane multiplied by the grouping plane normal minus the initial edge plane multiplied by the weighted edge length.
- the weighted length sum is maintained as the sum of the weighted edge lengths.
- the sum of the weighted edge planes is maintained.
- the grouping plane that the edges are adjusted to is maintained as the sum of the weighted edge planes divided by the weighted length sum.
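A minimal sketch of the axis estimate described in the steps above, under simplifying assumptions (2D footprint edges, the first edge used as the seed group, orientations averaged directly rather than via grouping planes); the length**influence weighting with influence value 3 follows the text, but everything else is illustrative:

```python
import math

def estimate_main_axis(edges, angle_threshold_deg=15.0, influence=3):
    """Estimate a main-axis orientation (radians, in [0, pi)) from building
    edges given as ((x1, y1), (x2, y2)) segments. Edges whose orientation
    falls within the angular threshold of the first (seed) edge are grouped,
    and their orientations are averaged with weights length**influence, so
    longer edges dominate the estimate."""
    def orientation(e):
        (x1, y1), (x2, y2) = e
        return math.atan2(y2 - y1, x2 - x1) % math.pi  # undirected orientation

    def length(e):
        (x1, y1), (x2, y2) = e
        return math.hypot(x2 - x1, y2 - y1)

    seed = orientation(edges[0])
    thresh = math.radians(angle_threshold_deg)
    weighted_sum = weight_total = 0.0
    for e in edges:
        a = orientation(e)
        diff = min(abs(a - seed), math.pi - abs(a - seed))  # angular distance to seed
        if diff <= thresh:
            w = length(e) ** influence  # weighted edge length
            if abs(a - seed) > math.pi / 2:  # unwrap across the 0/pi seam
                a += math.pi if a < seed else -math.pi
            weighted_sum += w * a
            weight_total += w
    return weighted_sum / weight_total

# Nearly-horizontal edges dominate; the vertical edge falls outside the threshold.
axis = estimate_main_axis([((0, 0), (10, 0)), ((0, 1), (10, 1.1)), ((0, 0), (0, 5))])
```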
- a camera position is calculated (step 505 ).
- the method of calculating the camera position is discussed with respect to FIG. 6 below.
- an optimal view of the selected 3D building model is generated (step 506 ) by a 3D building model visualization system according to the process and is communicated to the electronic device for display (step 507 ).
- FIG. 6 illustrates a flowchart of an exemplary embodiment for calculating a camera position in accordance with the present disclosure.
- the process involves calculating the camera position as discussed in block 505 of FIG. 5 .
- a projection pyramid as shown in FIG. 4 projecting from the camera is defined (step 601 ).
- a projection pyramid projecting from the camera point is defined manually.
- the projection pyramid projecting 401 from the camera point is set automatically by 3D building model visualization system.
- The offset angles up and down from forward direction vector 402 for the camera are equal, and the offset angles left and right of the forward direction vector 402 are equal. The offset angles are smaller than the defined field of view established in process 500 of FIG. 5 .
- the up and down offset angles from forward direction vector 402 are set to 20 degrees to give a 2.5-degree vector up and down.
- the left and right offset angles are adjusted based on the aspect ratio of the viewer window.
- the corner vectors defining the up-right, down-right, down-left, and up-left edges projecting from camera, as well as side vectors defining projections along the center-right, center-down, center-left, and center-up are stored by the 3D building model visualization system.
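The corner and side vectors can be sketched as follows in a camera-local frame (an assumed convention: the camera looks down -Z, with x as camera right and y as camera up); the function and the example angles are illustrative:

```python
import math

def pyramid_vectors(up_down_deg, left_right_deg):
    """Corner and side vectors of a projection pyramid in a camera-local
    frame (camera looks down -Z, x = camera right, y = camera up). The
    offset angles are measured from the forward direction vector."""
    tu = math.tan(math.radians(up_down_deg))     # vertical half-extent at unit depth
    tr = math.tan(math.radians(left_right_deg))  # horizontal half-extent at unit depth
    signs = {"up_right": (1, 1), "down_right": (1, -1),
             "down_left": (-1, -1), "up_left": (-1, 1)}
    corners = {name: (sx * tr, sy * tu, -1.0) for name, (sx, sy) in signs.items()}
    sides = {"center_right": (tr, 0.0, -1.0), "center_down": (0.0, -tu, -1.0),
             "center_left": (-tr, 0.0, -1.0), "center_up": (0.0, tu, -1.0)}
    return corners, sides

corners, sides = pyramid_vectors(2.5, 3.0)
```

The stored corner vectors bound the pyramid's four edges; the side vectors run along the centers of its four faces.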
- lower optimal views and top optimal views are generated for each selected 3D building model (step 602 ).
- for example, at least one view is generated for the positive direction of each axis and at least one for the negative direction of each axis (the primary axis and the perpendicular axis), providing four lower optimal views and four top optimal views.
- the system determines and calculates the forward direction vector as well as right and up camera vectors.
- the forward camera direction for lower optimal views is calculated as being in the same vertical plane of the calculated axis and using the defined downward pitch angle.
- the right vector 403 which will be parallel to the right direction of a display, is calculated as being 90 degrees clockwise to the forward camera direction vector and horizontal.
- the up vector 404 which will be parallel to the up direction of the viewer screen, is calculated as being rotated 90 degrees up from the forward direction vector and also as being in the same vertical plane of the calculated axis.
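The right- and up-vector construction above amounts to simple cross-product algebra; this sketch assumes a z-up world convention and is not the patent's code:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def camera_basis(forward):
    """Right and up vectors for a camera forward direction in a z-up world.
    Right is horizontal, 90 degrees clockwise from forward; up lies in the
    same vertical plane as forward, 90 degrees up from it."""
    fx, fy, fz = normalize(forward)
    right = normalize((fy, -fx, 0.0))   # horizontal, clockwise from forward
    up = cross(right, (fx, fy, fz))     # perpendicular to both, pointing up
    return right, up

# Forward along +Y pitched 20 degrees down; right is +X, up tilts back 20 degrees.
f = (0.0, math.cos(math.radians(20.0)), -math.sin(math.radians(20.0)))
right, up = camera_basis(f)
```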
- a projection plane (not shown) is defined as being normal or perpendicular to the camera forward direction vector and going through the vertex that is furthest forward along the camera forward direction vector (step 605 ).
- the projection pyramid vectors are rotated (step 606 ) so that the camera forward direction vector is centered between the up/down and left/right sides and between the vectors that define the pyramid. In another embodiment, the projection pyramid vectors are rotated so that the camera forward direction vector is centered between the up/down and left/right sides that define the pyramid.
- In a next step, for each vertex in the 3D building model structure selection, lines projected from the vertex in the direction of each corner pyramid vector are intersected with the projection plane. In one embodiment, if the vertex is below the terrain, the vertex location is snapped vertically to the terrain height, and that position is used instead. In another embodiment, for each vertex in the building model structure selection, lines projected from the vertex in the direction of each side pyramid vector are intersected with the projection plane. In certain embodiments, if the vertex is below the terrain, the vertex location is snapped vertically to the terrain height, and that position is used as the intersection point instead.
- In a next step (step 608 ), the minimum and maximum distances of the intersection points along the camera up and camera right direction vectors from a reference point on the projection plane are stored, along with the vertices that the projection points came from. These distances define a bounding box on the projection plane around the vertex line intersections.
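The bounding-box computation of step 608 can be sketched as projecting each intersection point onto the camera right and up direction vectors and taking minima and maxima (illustrative helper names; a z-up convention is assumed):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def bounding_box_on_plane(points, ref, cam_right, cam_up):
    """2D bounding box of intersection points on the projection plane:
    min/max signed distances along the camera right and up direction
    vectors, measured from a reference point on the plane."""
    us = [dot(sub(p, ref), cam_right) for p in points]
    vs = [dot(sub(p, ref), cam_up) for p in points]
    return (min(us), max(us)), (min(vs), max(vs))

(u_min, u_max), (v_min, v_max) = bounding_box_on_plane(
    [(1, 0, 2), (-3, 0, 5)], (0, 0, 0), (1, 0, 0), (0, 0, 1))
```

Tracking which vertex produced each extreme distance, as the text describes, would only require storing the index alongside each projected value.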
- the up and down extreme points are then projected onto a vertical plane normal or perpendicular to the camera right vector while the left and right extreme points are projected onto a plane normal or perpendicular to the camera up vector (blocks 609 and 610 ).
- Convergence points away (back) from the projection plane are then calculated (step 611 ) in two stages.
- a convergence point away from the top and bottom edges of the bounding box is calculated according to the following steps.
- the normal of the top surface of the projection pyramid is calculated as the cross product of the camera right vector and the projection pyramid center-up direction vector.
- a convergence plane is defined as the plane that has the same normal as the top surface of the projection pyramid and goes through the vertex with the highest intersection on the projection plane.
- a convergence line is defined as the line with the same direction as the projection pyramid center-down direction vector, going through the vertex with the lowest intersection on the projection plane.
- the intersection between the convergence plane and the convergence line produces the up/down convergence point.
- a convergence point away from the left and right edges of the bounding box is calculated according to the following steps.
- the normal of the right surface of the projection pyramid is calculated as the cross product of the camera up vector and the projection pyramid center-right direction vector.
- a convergence plane is defined as the plane that has the same normal as the right surface of the projection pyramid and goes through the vertex with the furthest right intersection on the projection plane.
- a convergence line is defined as the line with the same direction as the projection pyramid center-left direction vector, going through the vertex with the furthest left intersection on the projection plane.
- the intersection between the convergence plane and the convergence line produces the left/right convergence point.
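Both convergence points come down to intersecting a line with a plane. A sketch (numpy; argument names assumed): for the up/down point, the plane normal is the cross product of the camera right vector and the pyramid center-up direction, the plane passes through the highest intersection vertex, and the line runs along the center-down direction through the lowest intersection vertex.

```python
import numpy as np

def convergence_point(plane_normal, plane_point, line_dir, line_point):
    """Intersect the convergence line with the convergence plane."""
    n = np.asarray(plane_normal, dtype=float)
    t = float(np.dot(np.asarray(plane_point) - line_point, n)) / float(np.dot(line_dir, n))
    return np.asarray(line_point) + t * np.asarray(line_dir)
```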
- the distance of the camera from the projection plane is found next (step 612 ) by measuring the distance between the up/down convergence point and the projection plane and the distance between the left/right convergence point and the projection plane.
- the camera is positioned back from the center of the projection plane bounding box at the largest convergence point distance from the projection plane.
- the camera position is adjusted for better centering (step 613 ). If the up/down convergence distance from the projection plane is shorter than the left/right convergence distance, the calculations can be iterated with smaller up/down offset angles from the forward vector until the up/down convergence distance substantially matches the left/right convergence distance. If the left/right convergence distance from the projection plane is shorter than the up/down convergence distance, the calculations can be iterated with smaller left/right offset angles from the forward vector until the left/right convergence distance substantially matches the up/down convergence distance.
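The iterative centering of step 613 can be sketched as a simple search over the offset angle. Here `convergence_distance` is a placeholder for re-running the geometry above at a trial angle, and is assumed to vary monotonically with that angle; this illustrates the iteration only, not the claimed method:

```python
def balance_offset_angle(convergence_distance, target, lo, hi, tol=1e-6):
    """Bisect the up/down (or left/right) offset angle until its convergence
    distance substantially matches `target`, the other pair's distance.

    convergence_distance: hypothetical callable mapping a trial offset angle
    to the resulting convergence-point distance from the projection plane.
    """
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if convergence_distance(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```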
- FIGS. 7A and 7B illustrate examples of optimal views for 3D building models in the 3D building model visualization system in accordance with the present disclosure.
- FIG. 7A illustrates a screen-shot of multiple building models on a street within the disclosed 3D building model visualization system.
- one or more building models 701 are selected, for example, by typing in the exact address of a building model on the map.
- the address is entered in the 3D building model visualization system in the form of a mailing address, GPS/geographical coordinates of the building, or as a point of interest.
- the 3D building model is selected by clicking on it with a computer mouse, joystick or the like.
- the typing of the address may be done using known computer keyboard devices, voice or hand-waving commands, or any other known and available technology for entering data into the system.
- the building model is selected within the interface by touching or swiping on a touch-screen display or a capacitive screen display (e.g., using a finger, stylus or the like).
- the 3D building model visualization system calculates or retrieves a pre-calculated primary optimum view 702 of the building model as shown in FIG. 7B .
- FIG. 8 illustrates a general schematic of an embodiment in accordance with the present disclosure.
- the 3D building model visualization system displays the optimal views for a selected 3D building model. Instead of a single view (e.g., the primary optimal view), the 3D building model visualization system displays optimum views 801 through 805 of the building model's facades. Each of the displayed views is an optimal view (e.g., a view corresponding to a downward look angle as depicted in FIGS. 3A and 3B ).
- FIG. 9 illustrates an embodiment for displaying and navigating a 3D building model visualization system in accordance with the present disclosure.
- selection of a building model 900 from the 3D building model visualization system automatically generates and displays optimal views of the building model as fixed perspective views (e.g., front, left, right, top, rear and bottom). While a mouse cursor 901 was used in this figure to show selection of a building model in the 3D building model visualization system, any other known selection devices and methods are also contemplated. Selection of a building model by joy-stick, game controller, touching a touch sensitive screen, typing an address or geographic coordinates, automatic selection using a location, or by using voice commands, eye gazing or hand waving commands produce the same results.
- Interface navigation controls 902 allow for visualization of a building model and/or navigation of the 3D building model visualization system through fixed, optimal perspective views 903 through 907 of the building model 900 , simplifying the traditional “six degrees of freedom” fine motor adjustments.
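As a toy illustration of this object-oriented navigation (all names hypothetical, not from the disclosure), stepping through precomputed optimal views replaces free six-degrees-of-freedom camera control:

```python
class FacadeNavigator:
    """Cycle through a building model's precomputed optimal views."""

    def __init__(self, views):
        self.views = views          # e.g. one optimal view per facade
        self.index = 0              # start at the primary optimal view

    def current(self):
        return self.views[self.index]

    def step(self, direction):
        """direction: +1 (e.g. right arrow) or -1 (left arrow)."""
        self.index = (self.index + direction) % len(self.views)
        return self.current()

nav = FacadeNavigator(["front", "right", "rear", "left", "top"])
print(nav.step(+1))  # -> right
```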
- the terms “substantially” and “approximately” provide an industry-accepted tolerance for their corresponding terms and/or relativity between items. Such an industry-accepted tolerance ranges from less than one percent to fifty percent and corresponds to, but is not limited to, measurements, angles, positions and geo-locations. Such relativity between items ranges from a difference of a few percent to magnitude differences.
- the technology as described herein may have also been described, at least in part, in terms of one or more embodiments.
- An embodiment of the technology described herein is used to illustrate an aspect, a feature, a concept, and/or an example thereof.
- a physical embodiment of an apparatus, an article of manufacture, a machine, and/or of a process that embodies the technology described herein may include one or more of the aspects, features, concepts, examples, etc. described with reference to one or more of the embodiments discussed herein.
- the embodiments may incorporate the same or similarly named functions, steps, modules, etc. that may use the same or different reference numbers and, as such, the functions, steps, modules, etc. may be the same or similar functions, steps, modules, etc. or different ones.
Abstract
A method and system is provided for automatic generation and navigation of optimal views of facades of multi-dimensional building models. The system and method allows for navigation and visualization of facades of individual or multiple building models in a multi-dimensional building model visualization system.
Description
- The present U.S. Utility Patent Application claims priority pursuant to 35 U.S.C. §120 as a continuation of U.S. Utility application Ser. No. 14/339,992, entitled “METHOD AND SYSTEM FOR DISPLAYING AND NAVIGATING BUILDING FACADES IN A THREE-DIMENSIONAL MAPPING SYSTEM,” filed Jul. 24, 2014, scheduled to issue as U.S. Pat. No. 9,437,044, which claims priority pursuant to 35 U.S.C. §119(e) to U.S. Provisional Application No. 61/858,241, entitled “METHOD AND SYSTEM FOR DISPLAYING AND NAVIGATING BUILDING FACADES IN A THREE-DIMENSIONAL MAPPING SYSTEM,” filed Jul. 25, 2013, both of which are hereby incorporated herein by reference in their entirety and made part of the present U.S. Utility Patent Application for all purposes.
- The present Application is related to the following:
- 1. U.S. Utility application Ser. No. 13/624,816, entitled, “THREE-DIMENSIONAL MAP SYSTEM,” filed on Sep. 21, 2012, now U.S. Pat. No. 8,878,865, and
- 2. U.S. Utility application Ser. No. 12/265,656, entitled “METHOD AND SYSTEM FOR GEOMETRY EXTRACTION, 3D VISUALIZATION AND ANALYSIS USING ARBITRARY OBLIQUE IMAGERY,” filed Nov. 5, 2008, now U.S. Pat. No. 8,422,825.
- The technology described herein relates generally to an interactive three-dimensional (3D) map system and, in particular, to a system and method for visualizing 3D building models within an interactive 3D building model visualization system.
- Location-based and mobile technologies are often considered the center of the technology revolution of this century. Essential to these are ways to best present location-based information to electronic devices, particularly mobile devices. The technology used to represent this information has traditionally been based on a two-dimensional (2D) map.
- Efforts have been made to generate a three-dimensional (3D) map of urban cities via aerial imagery or specialized camera-equipped vehicles. 2D maps presenting street level photographs as well as 3D maps displaying 3D building models have emerged as alternatives to classical 2D maps. However, these 2D or 3D maps have limited texture resolution and geometry quality, and do not allow users to display the highest quality, optimal views of selected building models on a map. These efforts also do not integrate street-level imagery into a full 3D scene as they require dual data interfaces between aerial-derived 3D datasets and a disorienting transition to static panoramic street-level imagery, typically collected using vehicles. Ways of resolving these problems have been sought by those in the art, but prior developments have not taught or suggested any viable solutions.
- FIG. 1 illustrates one embodiment of a system for generating an optimal view of a 3D building model in accordance with the present disclosure;
- FIG. 2 illustrates a diagrammatic representation of a machine in the example form of a computer system in accordance with the present disclosure;
- FIGS. 3A and 3B illustrate example embodiments of building models from a three-dimensional building model visualization system in accordance with the present disclosure;
- FIG. 4 illustrates an example embodiment of calculating a camera position for a 3D building model in accordance with the present disclosure;
- FIG. 5 illustrates a flowchart embodiment of a process for generating optimal views of selected building models in accordance with the present disclosure;
- FIG. 6 illustrates a flowchart of an exemplary embodiment for calculating a camera position in accordance with the present disclosure;
- FIGS. 7A-7B collectively illustrate examples of optimal views for 3D building models in a 3D building model visualization system in accordance with the present disclosure;
- FIG. 8 illustrates a general schematic of a multiple view building model in accordance with the present disclosure; and
- FIG. 9 illustrates an embodiment for displaying and navigating a 3D building model visualization system in accordance with the present disclosure.
One or more embodiments of the technology described herein include a method and system for visualizing and navigating 3D building models. Optimal views of the facades (sides) of 3D building models are automatically generated for display. In another embodiment, the system and method provide for navigation through a 3D map system using one or more optimal views of facades.
FIG. 1 illustrates one embodiment of a system for generating an optimal view of a 3D building model in accordance with the present disclosure. In one embodiment, 3D building model visualization system 100 includes view processing system 102 and electronic device 104 coupled via a network channel 106. Network channel 106 is a system for communication. Network channel 106 includes, for example, an Ethernet or other wire-based network, or a wireless network or wireless adapter for communicating with a wireless network, such as a WI-FI network. In other embodiments, network channel 106 includes any suitable network for any suitable communication interface. As an example and not by way of limitation, network channel 106 can include an ad hoc network, a personal area network (PAN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), one or more portions of the Internet, cloud-based systems, or a combination of two or more of these. One or more portions of one or more of these networks may be wired or wireless. As another example, network channel 106 can be a wireless PAN (WPAN) (such as, for example, a BLUETOOTH WPAN), a WI-FI network, a WI-MAX network, a 3G or 4G network, or a cellular telephone network (such as, for example, a Global System for Mobile Communications (GSM) network). In one embodiment,
network channel 106 uses standard communications technologies and/or protocols. Thus, network channel 106 can include links using technologies such as Ethernet, 802.11, worldwide interoperability for microwave access (WiMAX), 3G, 4G, LTE, CDMA, digital subscriber line (DSL), etc. Similarly, the networking protocols used on network channel 106 can include multiprotocol label switching (MPLS), the transmission control protocol/Internet protocol (TCP/IP), the User Datagram Protocol (UDP), the hypertext transport protocol (HTTP), the simple mail transfer protocol (SMTP), and the file transfer protocol (FTP). In one embodiment, the data exchanged over network channel 106 is represented using technologies and/or formats including the hypertext markup language (HTML) and the extensible markup language (XML). In addition, all or some of the links can be encrypted using conventional encryption technologies such as secure sockets layer (SSL), transport layer security (TLS), and Internet Protocol security (IPsec).
Electronic device 104 is a device for selection and display of a 3D building model. For example, electronic device 104 is a computer with a monitor, a laptop, a touch screen display, a smartphone, a tablet computer, an LED array, a game console, a television set, a projector display, a wearable heads-up display, or any combination thereof. In one embodiment, electronic device 104 includes a computer system, such as computer system 200 of FIG. 2, for processing 3D visual data for display. In one embodiment,
view processing system 102 collects building selection information (e.g., address, geo-location, position or areas within a displayed map, etc.) corresponding to a selected 3D building model from electronic device 104. In some embodiments, electronic device 104 directly uploads selection information to view processing system 102 via the network channel 106, or indirectly uploads the information. For example, the information is uploaded to a computer or a server first before being uploaded to view processing system 102. As another example, the information is transferred from electronic device 104 to a networked computer, and then the information is transferred to view processing system 102.
FIG. 2 illustrates a diagrammatic representation of a computer system in accordance with the present disclosure. In one embodiment, computer system 200 includes a machine and a set of instructions for causing the machine to perform any one or more of the methodologies or modules discussed herein. Computer system 200 includes a processor, memory, non-volatile memory, and an interface device. Various common components (e.g., cache memory) are omitted for illustrative simplicity. Computer system 200 is intended to illustrate a hardware device on which any of the components depicted in the example of FIGS. 1-2 (and any other components described in this specification) can be implemented. Computer system 200 can be of any applicable known or convenient type. The components of computer system 200 can be coupled together via a bus or through some other known or convenient device. This disclosure contemplates
computer system 200 taking any suitable physical form. As an example and not by way of limitation, computer system 200 may be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC) (such as, for example, a computer-on-module (COM) or system-on-module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a personal digital assistant (PDA), a server, or a combination of two or more of these. Where appropriate, computer system 200 may include one or more computer systems; be unitary or distributed; span multiple locations; span multiple machines; or reside in a cloud, which may include one or more cloud components in one or more networks. Where appropriate, one or more computer systems 200 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein. As an example and not by way of limitation, one or more computer systems 200 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein. One or more computer systems 200 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
- The memory is coupled to the processor by, for example, a bus. The memory can include, by way of example but not limitation, random access memory (RAM), such as dynamic RAM (DRAM) and static RAM (SRAM). The memory can be local, remote, or distributed.
- The bus also couples the processor to the non-volatile memory and drive unit. The non-volatile memory is often a magnetic floppy or hard disk, a magnetic-optical disk, an optical disk, a read-only memory (ROM), such as a CD-ROM, EPROM, or EEPROM, a magnetic or optical card, or another form of storage for large amounts of data. Some of this data is often written, by a direct memory access process, into memory during execution of software in
computer system 200. The non-volatile storage can be local, remote, or distributed. The non-volatile memory is optional because systems can be created with all applicable data available in memory. A typical computer system will usually include at least a processor, memory, and a device (e.g., a bus) coupling the memory to the processor. - Software is typically stored in the non-volatile memory and/or the drive unit. Indeed, for large programs, it may not even be possible to store the entire program in the memory. Nevertheless, it should be understood that for software to run, if necessary, it is moved to a computer readable location appropriate for processing, and for illustrative purposes, that location is referred to as the memory in this paper. Even when software is moved to the memory for execution, the processor will typically make use of hardware registers to store values associated with the software, and local cache that, ideally, serves to speed up execution. As used herein, a software program is assumed to be stored at any known or convenient location (from non-volatile storage to hardware registers) when the software program is referred to as “implemented in a computer-readable medium.” A processor is considered to be “configured to execute a program” when at least one value associated with the program is stored in a register readable by the processor.
- The bus also couples the processor to the network interface device. The interface can include one or more of a modem or network interface. It will be appreciated that a modem or network interface can be considered to be part of
computer system 200. The interface can include an analog modem, ISDN modem, cable modem, token ring interface, satellite transmission interface (e.g., “direct PC”), or other interfaces for coupling a computer system to other computer systems. The interface can include one or more input and/or output devices. The I/O devices can include, by way of example but not limitation, a keyboard, a mouse or other pointing device, disk drives, printers, a scanner, and other input and/or output devices, including a display device. The display device can include, by way of example but not limitation, a cathode ray tube (CRT), liquid crystal display (LCD), or some other applicable known or convenient display device. For simplicity, it is assumed that controllers of any devices not depicted reside in the interface. - In operation,
computer system 200 can be controlled by operating system software that includes a file management system, such as a disk operating system. One example of operating system software with associated file management system software is the family of operating systems known as Windows® from Microsoft Corporation of Redmond, Wash., and their associated file management systems. Another example of operating system software with its associated file management system software is the Linux™ operating system and its associated file management system. The file management system is typically stored in the non-volatile memory and/or drive unit and causes the processor to execute the various acts required by the operating system to input and output data and to store data in the memory, including storing files on the non-volatile memory and/or drive unit. - According to one or more embodiments of the technology described herein, an “optimal view” of a 3D building model facade is defined as a preferred view of a 3D building model (e.g., elevated front view) which is generated by a 3D building model visualization system from a bird's eye view/perspective and which calculates and generates an unobstructed or a partially unobstructed view of a 3D building model or group of connected or close proximity (e.g., same city block) models. The generated view is provided to a computer system display as a centered image including at least one facade (side) of the 3D building model. The 3D building model refers to a textured 3D building model using known texturing techniques
- In one embodiment, the generated optimal view of the 3D building model includes visible buffer regions around the centered view of the building (top, bottom, left, right). In one or more embodiments, these buffer regions are white space around a selected 3D building object. In other embodiments, the buffer regions include elements of building models and/or other visible structures located in the vicinity of the selected building model(s) within the 3D map.
-
FIGS. 3A and 3B illustrate embodiments of 3D building models from the 3D building model visualization system in accordance with the present disclosure. FIG. 3A illustrates one embodiment of a building model where 3D building model 300 is selected to be displayed on electronic device 104 of FIG. 1. In one embodiment, the 3D building model is selected on the electronic device using current global positioning system (GPS) information. In alternative embodiments, building model 300 is selected by touching or clicking (e.g., using a mouse/stylus) on a location of a map on the display of electronic device 112 of FIG. 1 or by typing in an address of the desired 3D building model. In one embodiment,
electronic device 104 displays a view of building model 300 in the form of a view of building model facade 304. First main axis 301 and second main axis 302 are calculated axes (e.g., following a primary roof line, for example a lengthwise building peak, or orthogonal to the other axis) which define the orientation of the building model. As will be described in greater detail hereafter, the disclosed 3D building model visualization system, in accordance with the present disclosure, automatically calculates and/or pre-stores previously calculated optimal views for each facade of the selected or stored 3D building models. In one or more embodiments, an optimal view for building model facade 304 is calculated based on a defined look angle 303 with respect to calculated main axis 301 and a defined field of view (e.g., defined up/down, left/right angles which define the extent of what is viewed on a 3D map) for the selected building. The calculated optimal view is centered on the display screen. A buffer zone (e.g., white space or elements of surrounding model structures) around the facade of the building model is typically included in the displayed view for centering purposes. A primary optimal view is the optimal view which the 3D building model visualization system ranks as the highest quality optimal view available for display. In one or more embodiments, the primary optimal view is a view generated for the front facade of a 3D building model. A primary axis is a first main axis perpendicular to the side of the building model which is displayed in the calculated primary optimal view. In one embodiment, selection of a building model on a 3D map will result in the 3D building model visualization system calculating optimal views for all the sides of a building model and displaying only the primary optimal view of the building model.
In another embodiment, the 3D building model visualization system will provide interface controls to select one or more of the calculated optimal views of the building model. In one embodiment, the building model facade 304 is facing a main road or street in the 3D building model visualization system, and the primary optimal view (typically the front) is selected to be one of the optimal views which face a main street or road path 305 from the 3D building model visualization system.
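The look-angle construction described above (a downward pitch such as look angle 303, applied relative to a horizontal main axis such as axis 301) can be sketched as follows; the coordinate conventions and names are assumptions for illustration:

```python
import numpy as np

def view_direction(axis_xy, look_angle_deg):
    """Camera forward vector: along the horizontal main axis, pitched down
    by the defined look angle (Z-up world assumed)."""
    a = np.asarray(axis_xy, dtype=float)
    a = a / np.linalg.norm(a)
    pitch = np.radians(look_angle_deg)
    return np.array([a[0] * np.cos(pitch), a[1] * np.cos(pitch), -np.sin(pitch)])

# Example: facade axis pointing +X, with a 20-degree downward look angle
d = view_direction([1.0, 0.0], 20.0)
```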
FIG. 3B illustrates another embodiment of a building model from the 3D building model visualization system in accordance with the present disclosure. In one embodiment, building model 300 has a primary optimal view selected on side 307 of the 3D building model which is not facing a road. The optimal view is calculated and displayed based on a defined look angle 306 and second main axis 302. In one embodiment, machine learning capable algorithms are used to identify and determine a selected 3D building model's architectural features (e.g., windows, bay windows, doors, columns, railings, etc.) and rank the various facades of a selected 3D building model based on expected architectural features typically located on primary facades (e.g., centered front door, portico, porch, high number of windows, etc.). In additional embodiments, known neural networks or statistics, such as archetypal analysis on buildings or imagery, are used to determine a pattern as to what feature combinations and layouts generally constitute the primary optimal view of a building model within the 3D building model visualization system.
In one or more embodiments, optimal views and navigation within the 3D building model visualization system are achieved through pre-calculated, stored building model views and inter-building navigation routes (e.g., roads, intersections, etc.). As the 3D map system is navigated, stored calculated optimal views of the building models are displayed. The 3D mapping system routes and views are then navigated in an object-oriented, spatially intuitive (e.g., up, down, left, right) manner. The system simplifies industry-common 3D navigational approaches, typically referred to as "six degrees of freedom" methods, that require fine motor adjustments not needed when navigating a 3D mapping system.
In other words, the navigator can navigate in 3D by sequencing through optimal views of one or more building models rather than trying to navigate in 3 dimensions using, for example, a 3D mouse or joystick.
FIG. 4 illustrates an example embodiment of calculating a camera position for a 3D building model in accordance with the present disclosure. In one embodiment, camera position 401 is arranged for generating optimal views for one or multiple building models 400 in a 3D building model visualization system. In one embodiment, projection pyramid 401, having forward direction vector 402, right vector 403 and up vector 404 projecting from the camera point, is used to calculate a camera position. Additional details for calculating a camera position are discussed in FIG. 6.
FIG. 5 illustrates an embodiment of a flowchart of the process for generating optimal views of selected 3D building models in accordance with the present disclosure. Process 500 is initiated when one or more 3D building models are retrieved from the 3D building model visualization system (step 501) based on a selected building location. As previously stated, the location is selected using current global positioning system (GPS) information (e.g., a user is standing in front of a building of interest), by touching or clicking on a location of a map, or by typing in an address of the desired 3D building model. In one embodiment, the 3D building models for the selected location are retrieved from the 3D map system. In a next step, an optimal look angle (downward pitch angle) is defined (step 502) for the selected 3D building model. In one embodiment an optimal look angle, as depicted in
FIGS. 3A-3B (303/306), is set as a default downward pitch angle relative to the calculated axis by the 3D building model visualization system (e.g., 10-50 degrees). For example, in one embodiment an optimal look angle is selected at or proximate to 20 degrees. In alternative embodiments, the optimal look angle is defined to limit obstructions from the view of the 3D building model facades. For example, if a tall structure is in the line of sight of the optimal view, the look angle can be raised to look over the obstruction.
- In the next step, the first (primary) and second (perpendicular) axes of the selected building models are calculated by the system (step 504). In one embodiment, angular thresholds are used to establish groups of like angles and identify the edges of the structure (e.g., 3D building model) in order to determine four sides as well as the orientation of a first optimal view. In this exemplary embodiment, a first edge becomes the initial grouping plane location. As edges are added to a specific grouping, a weighted edge length is generated for each, equal to the new edge length raised to the power of the influence value. In one embodiment, the influence value is 3. In other embodiments, if the influence value is zero, then a main axis is calculated as the average of all of the edge orientations within the angular threshold, regardless of their length. The larger the influence value, the more influence longer edges (and the less influence shorter edges) have over the estimate of the axis through the edges. A weighted edge plane is generated for each edge as the distance of the new edge from the initial edge plane, multiplied by the grouping plane normal minus the initial edge plane, multiplied by the weighted edge length. The weighted length sum is maintained as the sum of the weighted edge lengths, and the sum of the weighted edge planes is likewise maintained. The grouping plane to which the edges are adjusted is maintained as the sum of the weighted edge planes divided by the weighted length sum. While this method of establishing groups of like angles reflects an exemplary embodiment, other methods of recognizing like angles are envisioned without departing from the scope of the technology described herein.
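The influence-weighted grouping described above can be sketched in a simplified two-dimensional form. This is an illustrative approximation, not the disclosed method itself: it averages edge orientations directly rather than maintaining grouping planes, and the function and parameter names are assumptions.

```python
import math

def estimate_main_axis(edges, influence=3.0, angular_threshold_deg=15.0):
    """Estimate a building's main axis orientation from footprint edges.

    Each edge is a ((x0, y0), (x1, y1)) pair. Edges whose orientation lies
    within `angular_threshold_deg` of the first edge form a group; each
    contributes its orientation weighted by length**influence, so longer
    edges dominate the estimate. With influence = 0 every in-threshold
    edge counts equally, matching the zero-influence case in the text.
    """
    (x0, y0), (x1, y1) = edges[0]
    ref_angle = math.atan2(y1 - y0, x1 - x0) % math.pi  # undirected orientation
    weighted_sum = 0.0
    weight_total = 0.0
    for (ax, ay), (bx, by) in edges:
        length = math.hypot(bx - ax, by - ay)
        angle = math.atan2(by - ay, bx - ax) % math.pi
        # angular difference of undirected lines, in [0, pi/2]
        diff = min(abs(angle - ref_angle), math.pi - abs(angle - ref_angle))
        if math.degrees(diff) > angular_threshold_deg:
            continue  # belongs to the perpendicular (or another) group
        w = length ** influence
        # unwrap so orientations near 0 and near pi average correctly
        if angle - ref_angle > math.pi / 2:
            angle -= math.pi
        elif ref_angle - angle > math.pi / 2:
            angle += math.pi
        weighted_sum += w * angle
        weight_total += w
    return weighted_sum / weight_total  # axis orientation in radians
```

For a rectangular footprint aligned with the x-axis, the perpendicular edges fall outside the angular threshold and the estimated primary axis comes out at zero radians.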
- In a next step, a camera position is calculated (step 505). The method of calculating the camera position is discussed with respect to
FIG. 6 below. In a final step, an optimal view of the selected 3D building model is generated (step 506) by a 3D building model visualization system according to the process and is communicated to the electronic device for display (step 507). -
FIG. 6 illustrates a flowchart of an exemplary embodiment for calculating a camera position in accordance with the present disclosure. The process involves calculating the camera position as discussed in block 505 of FIG. 5. In a first step, a projection pyramid 401 as shown in FIG. 4, projecting from the camera, is defined (step 601). In one embodiment, the projection pyramid projecting from the camera point is defined manually. In other embodiments, the projection pyramid 401 projecting from the camera point is set automatically by the 3D building model visualization system. The offset angles up and down from forward direction vector 402 for the camera are equal, and the offset angles left and right of the forward direction vector 402 are equal. The offset angles are smaller than the defined field of view established in process 500 of FIG. 5. In one or more embodiments, by default, the up and down offset angles from forward direction vector 402 are set to 20 degrees to give a 2.5 degrees vector up and down. The left and right offset angles are adjusted based on the aspect ratio of the viewer window. In one embodiment, the corner vectors defining the up-right, down-right, down-left, and up-left edges projecting from the camera, as well as the side vectors defining projections along the center-right, center-down, center-left, and center-up, are stored by the 3D building model visualization system. In another embodiment, only the side vectors defining projections along the center-right, center-down, center-left, and center-up are stored by the 3D building model visualization system. - In a next step, lower optimal views and top optimal views are generated for each selected 3D building model (step 602). In an example embodiment, at least one view is generated for the positive direction and at least one for the negative direction of each axis (the primary axis and the perpendicular axis), providing four lower optimal views and four top optimal views.
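The four corner direction vectors of such a projection pyramid can be sketched as follows. The coordinate convention (camera looking along +x, right along +y, up along +z) and the function name are assumptions made for illustration:

```python
import math

def pyramid_corner_vectors(up_down_deg, left_right_deg):
    """Build unit corner direction vectors of a projection pyramid.

    The corners are offset from the forward (+x) direction by the given
    up/down and left/right half-angles, which the text notes are smaller
    than the field of view and (left/right) scaled by the viewer aspect
    ratio.
    """
    ud = math.radians(up_down_deg)
    lr = math.radians(left_right_deg)
    corners = {}
    for name, (sy, sz) in {"up_right": (1, 1), "down_right": (1, -1),
                           "down_left": (-1, -1), "up_left": (-1, 1)}.items():
        v = (1.0, sy * math.tan(lr), sz * math.tan(ud))
        n = math.sqrt(sum(c * c for c in v))
        corners[name] = tuple(c / n for c in v)  # normalized direction
    return corners
```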
- In another step, to define the camera vectors (step 603), the system calculates the forward direction vector as well as the right and up camera vectors. The forward camera direction for lower optimal views is calculated as lying in the same vertical plane as the calculated axis and using the defined downward pitch angle. The
right vector 403, which will be parallel to the right direction of a display, is calculated as being 90 degrees clockwise from the forward camera direction vector and horizontal. The up vector 404, which will be parallel to the up direction of the viewer screen, is calculated as being rotated 90 degrees up from the forward direction vector and also as lying in the same vertical plane as the calculated axis. - In a next step (step 604), the vertex in the building model structure selection that is furthest forward along the camera
forward direction vector 402 is found and a projection plane (not shown) is defined as being normal or perpendicular to the camera forward direction vector and going through the vertex that is furthest forward along the camera forward direction vector (step 605). - In one embodiment, the projection pyramid vectors are rotated (step 606) so that the camera forward direction vector is centered between the up/down and left/right sides and between the vectors that define the pyramid. In another embodiment, the projection pyramid vectors are rotated so that the camera forward direction vector is centered between the up/down and left/right sides that define the pyramid.
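The camera vector construction of step 603 can be sketched as follows, under an assumed east-north-up coordinate frame (x east, y north, z up); the function name and frame convention are illustrative, not taken from the disclosure:

```python
import math

def camera_basis(axis_heading_rad, pitch_down_rad):
    """Compute forward, right, and up camera vectors for a lower view.

    The forward vector lies in the vertical plane of the building axis
    and is pitched downward by the look angle; right is 90 degrees
    clockwise from forward and horizontal; up is forward rotated 90
    degrees upward within the same vertical plane.
    """
    ch, sh = math.cos(axis_heading_rad), math.sin(axis_heading_rad)
    cp, sp = math.cos(pitch_down_rad), math.sin(pitch_down_rad)
    forward = (cp * ch, cp * sh, -sp)  # pitched below the horizontal
    right = (sh, -ch, 0.0)             # horizontal, 90 degrees clockwise
    up = (sp * ch, sp * sh, cp)        # in the same vertical plane as forward
    return forward, right, up
```

The three vectors form an orthonormal frame, so the right and up vectors map directly to the display's horizontal and vertical directions.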
- In one embodiment, in a next step (step 607), for each vertex in the 3D building model structure selection, lines projected from the vertex in the direction of each corner pyramid vector are intersected with the projection plane. In one embodiment, if the vertex is below the terrain, the vertex location is snapped vertically to the terrain height, and that position is used instead. In another embodiment, for each vertex in the building model structure selection, lines projected from the vertex in the direction of each side pyramid vector are intersected with the projection plane. In certain embodiments, if the vertex is below the terrain, the vertex location is snapped vertically to the terrain height, and that position is used as the intersection point instead.
- In a next step (step 608), the minimum and maximum distances of the intersection points along the camera up and camera right direction vectors from a reference point on the projection plane are stored along with the vertices that the projection points came from. These distances define a bounding box on the projection plane around the vertex line intersections.
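Steps 604 through 608 can be sketched as a single routine that finds the projection plane, casts a line from each vertex along each pyramid corner direction, and bounds the resulting intersections. Terrain snapping is omitted for brevity, and all names are assumptions:

```python
def projection_bounding_box(vertices, corner_dirs, forward, right, up):
    """Bound the vertex-line intersections on the projection plane.

    The projection plane is normal to `forward` and passes through the
    vertex furthest along it. From every vertex a line is cast along each
    pyramid corner direction and intersected with that plane; the min/max
    coordinates of the intersections along the camera `right` and `up`
    vectors define the bounding box.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    plane_d = max(dot(v, forward) for v in vertices)  # furthest-forward vertex
    min_r = min_u = float("inf")
    max_r = max_u = float("-inf")
    for v in vertices:
        for d in corner_dirs:
            denom = dot(d, forward)
            if abs(denom) < 1e-12:
                continue  # line parallel to the projection plane
            t = (plane_d - dot(v, forward)) / denom
            p = tuple(vi + t * di for vi, di in zip(v, d))
            r, u = dot(p, right), dot(p, up)
            min_r, max_r = min(min_r, r), max(max_r, r)
            min_u, max_u = min(min_u, u), max(max_u, u)
    return (min_r, max_r), (min_u, max_u)
```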
- The up and down extreme points are then projected onto a vertical plane normal or perpendicular to the camera right vector while the left and right extreme points are projected onto a plane normal or perpendicular to the camera up vector (
blocks 609 and 610). - Convergence points away (back) from the projection plane are then calculated (step 611) in two stages. In a first stage, a convergence point away from the top and bottom edges of the bounding box is calculated according to the following steps. In a first step, the normal of the top surface of the projection pyramid is calculated as the cross product of the camera right vector and the projection pyramid center-up direction vector. In a second step, a convergence plane is defined as the plane whose normal is the same as that of the top surface of the projection pyramid and which passes through the vertex with the highest intersection on the projection plane. In a third step, a convergence line is defined as the line with the same direction as the projection pyramid center-down direction vector, passing through the vertex with the lowest intersection on the projection plane. In a final step, the intersection between the convergence plane and the convergence line produces the up/down convergence point.
- In a second stage, a convergence point away from the left and right edges of the bounding box is calculated according to the following steps. In a first step, the normal of the right surface of the projection pyramid is calculated as the cross product of the camera up vector and the projection pyramid center-right direction vector. In a second step, a convergence plane is defined as the plane whose normal is the same as that of the right surface of the projection pyramid and which passes through the vertex with the furthest right intersection on the projection plane. In a third step, a convergence line is defined as the line with the same direction as the projection pyramid center-left direction vector, passing through the vertex with the furthest left intersection on the projection plane. In a fourth step, the intersection between the convergence plane and the convergence line is calculated to produce the left/right convergence point.
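Both convergence stages reduce to the same plane-line intersection, which can be sketched as follows; the helper names are assumptions:

```python
def convergence_point(plane_normal, plane_point, line_dir, line_point):
    """Intersect a convergence plane with a convergence line.

    The plane has `plane_normal` (e.g. the cross product of the camera
    right vector and the pyramid center-up direction) and passes through
    one extreme intersection vertex; the line runs along the opposite
    center direction through the other extreme vertex.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(plane_normal, line_dir)
    if abs(denom) < 1e-12:
        raise ValueError("line is parallel to the convergence plane")
    diff = tuple(p - q for p, q in zip(plane_point, line_point))
    t = dot(plane_normal, diff) / denom
    return tuple(q + t * d for q, d in zip(line_point, line_dir))

def cross(a, b):
    # cross product used to form the pyramid surface normals
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])
```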
- The distance of the camera from the projection plane is found next (step 612) by measuring the distance between the up/down convergence point and the projection plane and the distance between the left/right convergence point and the projection plane. The camera is positioned back from the center of the projection plane bounding box at the largest convergence point distance from the projection plane.
- In one or more embodiments, the camera position is adjusted for better centering (step 613). If the up/down convergence distance from the projection plane is shorter than the left/right convergence distance, the calculations can be iterated with smaller up/down offset angles from the forward vector until the up/down convergence distance substantially matches the left/right convergence distance. If the left/right convergence distance from the projection plane is shorter than the up/down convergence distance, the calculations can be iterated with smaller left/right offset angles from the forward vector until the left/right convergence distance substantially matches the up/down convergence distance.
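The iterative adjustment can be sketched as a simple loop. The distance-computing callback, the 0.9 reduction factor, and the tolerance are illustrative assumptions, since the disclosure does not specify how the offset angles are shrunk:

```python
def balance_offsets(compute_distances, ud_angle, lr_angle, tol=0.05, max_iter=50):
    """Shrink the shorter side's offset angles until distances match.

    `compute_distances(ud_angle, lr_angle)` is an assumed callback that
    reruns the convergence calculation and returns the up/down and
    left/right convergence distances from the projection plane. Whichever
    distance is shorter, that pair of offset angles is reduced until the
    two distances substantially match, as described above.
    """
    for _ in range(max_iter):
        d_ud, d_lr = compute_distances(ud_angle, lr_angle)
        if abs(d_ud - d_lr) <= tol * max(d_ud, d_lr):
            break
        if d_ud < d_lr:
            ud_angle *= 0.9  # narrower up/down offsets push the camera back
        else:
            lr_angle *= 0.9  # narrower left/right offsets push the camera back
    return ud_angle, lr_angle
```

With a toy model in which the convergence distance is inversely proportional to the offset angle, the loop drives the two distances together.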
-
FIGS. 7A and 7B illustrate examples of optimal views for 3D building models in the 3D building model visualization system in accordance with the present disclosure. FIG. 7A illustrates a screen-shot of multiple building models on a street within the disclosed 3D building model visualization system. In one embodiment, one or more building models 701 are selected, for example, by typing in the exact address of a building model on the map. In one embodiment, the address is entered in the 3D building model visualization system in the form of a mailing address, GPS/geographical coordinates of the building, or as a point of interest. In other embodiments, the 3D building model is selected by clicking on it with a computer mouse, joystick or the like. The typing of the address may be done using known computer keyboard devices, voice or hand waving commands or any other known and available technology for typing data into the system. In other embodiments, the building model is selected within the interface by touching or swiping on a touch-screen display or a capacitive screen display (e.g., using a finger, stylus or the like). In one embodiment, upon selection of a 3D building model (e.g., structure or object), the 3D building model visualization system calculates or retrieves a pre-calculated primary optimum view 702 of the building model as shown in FIG. 7B. -
FIG. 8 illustrates a general schematic of an embodiment in accordance with the present disclosure. In one embodiment, the 3D building model visualization system displays the optimal views for a selected 3D building model. Instead of a single view (e.g., the primary optimal view), the 3D building model visualization system displays optimum views 801 through 805 of the building model's facades. Each of the displayed views is an optimal view (e.g., a view corresponding to the downward look angles depicted in FIGS. 3A and 3B). -
FIG. 9 illustrates an embodiment for displaying and navigating a 3D building model visualization system in accordance with the present disclosure. In a first embodiment, selection of a building model 900 from the 3D building model visualization system automatically generates and displays optimal views of the building model as fixed perspective views (e.g., front, left, right, top, rear and bottom). While a mouse cursor 901 is used in this figure to show selection of a building model in the 3D building model visualization system, any other known selection devices and methods are also contemplated. Selection of a building model by joystick, game controller, touching a touch sensitive screen, typing an address or geographic coordinates, automatic selection using a location, or by using voice commands, eye gazing or hand waving commands produces the same results. Interface navigation controls 902 (e.g., up, left, right and down arrows) allow for visualization of a building model and/or navigation in the 3D building model visualization system through fixed, optimal perspective views 903 through 907 of the building model 900, simplifying the traditional "six degrees of freedom" fine motor adjustments. - Throughout the specification, drawings and claims various terminology is used to describe the various embodiments. As may be used herein, the terms "substantially" and "approximately" provide an industry-accepted tolerance for their corresponding terms and/or relativity between items. Such an industry-accepted tolerance ranges from less than one percent to fifty percent and corresponds to, but is not limited to, measurements, angles, positions and geo-locations. Such relativity between items ranges from a difference of a few percent to magnitude differences.
- The technology as described herein may have also been described, at least in part, in terms of one or more embodiments. An embodiment of the technology as described herein is used herein to illustrate an aspect thereof, a feature thereof, a concept thereof, and/or an example thereof. A physical embodiment of an apparatus, an article of manufacture, a machine, and/or of a process that embodies the technology described herein may include one or more of the aspects, features, concepts, examples, etc. described with reference to one or more of the embodiments discussed herein. Further, from figure to figure, the embodiments may incorporate the same or similarly named functions, steps, modules, etc. that may use the same or different reference numbers and, as such, the functions, steps, modules, etc. may be the same or similar functions, steps, modules, etc. or different ones.
- While particular combinations of various functions and features of the technology as described herein have been expressly described herein, other combinations of these features and functions are likewise possible. For example, the steps may be completed in varied sequences to complete the textured facades. The technology as described herein is not limited by the particular examples disclosed herein and expressly incorporates these other combinations.
Claims (20)
1. A method of visualizing a multi-dimensional building model on a computer display, the method comprises:
retrieving a multi-dimensional building model from computer memory;
defining a look angle and field of view for the multi-dimensional building model, the defining a look angle including determining up, down, left and right angles which define an extent of the multi-dimensional building model to be viewed on the computer display;
calculating a first main axis and a second main axis for the retrieved multi-dimensional building model to define an orientation of the multi-dimensional building model;
calculating, for at least a selected facade, an optimal camera position for the retrieved multi-dimensional building model based on the defined look angle, the defined field of view and the orientation, the optimal camera position defining an optimal view including a centered and at least partially unobstructed view of the retrieved multi-dimensional building model; and
displaying the optimal view on the computer display.
2. The method of claim 1, wherein the method is repeated for one or more facades of the multi-dimensional building model.
3. The method of claim 1, wherein the optimal view includes an unobstructed view.
4. The method of claim 1, wherein the optimal view includes: a group of connected multi-dimensional building models or a group of close proximity multi-dimensional building models.
5. The method of claim 1, further comprising adding a visible buffer space around the optimal view.
6. The method of claim 5, wherein the optimal view includes visible buffer regions around a centered view of the multi-dimensional building model.
7. The method of claim 6, wherein the visible buffer regions comprise any of: white space, elements of the same or different multi-dimensional building models, or other visible structures located in a vicinity of the retrieved multi-dimensional building model within a multi-dimensional map.
8. The method of claim 1, wherein the defined look angle is a downwards looking pitch angle.
9. The method of claim 8, wherein the downwards looking pitch angle is approximately 20 degrees.
10. The method of claim 1, wherein the retrieving a multi-dimensional building model from computer memory is based on building selection information including any of: an address, a geo-location, a position or areas within a displayed map.
11. The method of claim 1, wherein the retrieving a multi-dimensional building model from computer memory includes retrieving from a building model visualization system that pre-stores previously calculated ones of the centered at least partially unobstructed optimal views for each facade.
12. A system for visualizing a multi-dimensional building model, the system comprising:
a view processor for processing an optimal view of a multi-dimensional building model, the processing comprising:
retrieving the multi-dimensional building model from computer memory;
defining a look angle and field of view for the multi-dimensional building model, the defining a look angle including determining up, down, left and right angles which define an extent of the multi-dimensional building model to be viewed on a computer display;
calculating a first main axis and a second main axis for the multi-dimensional building model to define an orientation of the multi-dimensional building model;
calculating, for at least a selected facade, an optimal camera position for the multi-dimensional building model based on the defined look angle, the defined field of view and the orientation, the optimal camera position defining the optimal view as a centered and at least partially unobstructed view of the multi-dimensional building model; and
communicating the optimal view to a remote electronic device to be displayed on the computer display of the remote electronic device.
13. The system according to claim 12, wherein the view processor processes the optimal view for one or more facades of the multi-dimensional building model.
14. The system according to claim 12, wherein the optimal view for one or more selected facades of the multi-dimensional building model is stored in memory for future display or multi-dimensional navigation.
15. A method of visualizing a multi-dimensional building model on a mobile electronic device display, the method comprises:
selecting a multi-dimensional building model based on a location of the multi-dimensional building model;
wirelessly retrieving, from remote computer memory, a previously stored optimal view of the selected multi-dimensional building model, the optimal view based on at least an optimal camera position for the selected multi-dimensional building model, the optimal camera position based on a look angle, a field of view and an orientation of the selected multi-dimensional building model, wherein the optimal view includes a centered at least partially unobstructed view of the selected multi-dimensional building model; and
displaying the optimal view on the mobile electronic device display.
16. The method of claim 15, wherein the optimal view is used to navigate within a multi-dimensional map at least partially displayed on the mobile electronic device display.
17. The method of claim 15, wherein the look angle for the optimal view is a downwards looking pitch angle.
18. The method of claim 17, wherein the downwards looking pitch angle is 10-50 degrees.
19. The method of claim 15, wherein the optimal view includes visible buffer regions around a centered view of the multi-dimensional building model.
20. The method of claim 19, wherein the visible buffer regions comprise any of: white space, elements of the same or different multi-dimensional building models, or other visible structures located in a vicinity of the selected multi-dimensional building model(s) within a multi-dimensional map.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/255,952 US20160371882A1 (en) | 2008-11-05 | 2016-09-02 | Method and system for displaying and navigating an optimal multi-dimensional building model |
US15/721,062 US10127721B2 (en) | 2013-07-25 | 2017-09-29 | Method and system for displaying and navigating an optimal multi-dimensional building model |
US16/186,163 US10657714B2 (en) | 2013-07-25 | 2018-11-09 | Method and system for displaying and navigating an optimal multi-dimensional building model |
US16/848,844 US10977862B2 (en) | 2013-07-25 | 2020-04-15 | Method and system for displaying and navigating an optimal multi-dimensional building model |
US17/202,578 US11783543B2 (en) | 2013-07-25 | 2021-03-16 | Method and system for displaying and navigating an optimal multi-dimensional building model |
US18/353,008 US20230360330A1 (en) | 2013-07-25 | 2023-07-14 | Method and system for displaying and navigating an optimal multi-dimensional building model |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/265,656 US8422825B1 (en) | 2008-11-05 | 2008-11-05 | Method and system for geometry extraction, 3D visualization and analysis using arbitrary oblique imagery |
US201361858241P | 2013-07-25 | 2013-07-25 | |
US14/339,992 US9437044B2 (en) | 2008-11-05 | 2014-07-24 | Method and system for displaying and navigating building facades in a three-dimensional mapping system |
US15/255,952 US20160371882A1 (en) | 2008-11-05 | 2016-09-02 | Method and system for displaying and navigating an optimal multi-dimensional building model |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/339,992 Continuation US9437044B2 (en) | 2008-11-05 | 2014-07-24 | Method and system for displaying and navigating building facades in a three-dimensional mapping system |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/721,062 Continuation-In-Part US10127721B2 (en) | 2013-07-25 | 2017-09-29 | Method and system for displaying and navigating an optimal multi-dimensional building model |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160371882A1 true US20160371882A1 (en) | 2016-12-22 |
Family
ID=48049205
Family Applications (15)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/265,656 Active 2031-08-06 US8422825B1 (en) | 2008-11-05 | 2008-11-05 | Method and system for geometry extraction, 3D visualization and analysis using arbitrary oblique imagery |
US13/858,707 Active US8649632B2 (en) | 2008-11-05 | 2013-04-08 | System and method for correlating oblique images to 3D building models |
US14/164,508 Active 2029-01-12 US9430871B2 (en) | 2008-11-05 | 2014-01-27 | Method of generating three-dimensional (3D) models using ground based oblique imagery |
US14/168,149 Abandoned US20140320485A1 (en) | 2008-11-05 | 2014-01-30 | System for generating geocoded three-dimensional (3d) models |
US14/339,127 Active 2034-12-18 US9437033B2 (en) | 2008-11-05 | 2014-07-23 | Generating 3D building models with ground level and orthogonal images |
US15/255,952 Abandoned US20160371882A1 (en) | 2008-11-05 | 2016-09-02 | Method and system for displaying and navigating an optimal multi-dimensional building model |
US15/255,807 Active US10776999B2 (en) | 2008-11-05 | 2016-09-02 | Generating multi-dimensional building models with ground level images |
US16/544,327 Active US10643380B2 (en) | 2008-11-05 | 2019-08-19 | Generating multi-dimensional building models with ground level images |
US16/832,403 Active US10769847B2 (en) | 2008-11-05 | 2020-03-27 | Systems and methods for generating planar geometry |
US16/990,453 Active US11113877B2 (en) | 2008-11-05 | 2020-08-11 | Systems and methods for generating three dimensional geometry |
US17/396,255 Active US11741667B2 (en) | 2008-11-05 | 2021-08-06 | Systems and methods for generating three dimensional geometry |
US17/826,067 Active US11574441B2 (en) | 2008-11-05 | 2022-05-26 | Systems and methods for generating three dimensional geometry |
US17/826,085 Active US11574442B2 (en) | 2008-11-05 | 2022-05-26 | Systems and methods for generating three dimensional geometry |
US18/351,350 Pending US20230360326A1 (en) | 2008-11-05 | 2023-07-12 | Systems and methods for generating three dimensional geometry |
US18/351,378 Pending US20230377261A1 (en) | 2008-11-05 | 2023-07-12 | Systems and methods for generating three dimensional geometry |
Family Applications Before (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/265,656 Active 2031-08-06 US8422825B1 (en) | 2008-11-05 | 2008-11-05 | Method and system for geometry extraction, 3D visualization and analysis using arbitrary oblique imagery |
US13/858,707 Active US8649632B2 (en) | 2008-11-05 | 2013-04-08 | System and method for correlating oblique images to 3D building models |
US14/164,508 Active 2029-01-12 US9430871B2 (en) | 2008-11-05 | 2014-01-27 | Method of generating three-dimensional (3D) models using ground based oblique imagery |
US14/168,149 Abandoned US20140320485A1 (en) | 2008-11-05 | 2014-01-30 | System for generating geocoded three-dimensional (3d) models |
US14/339,127 Active 2034-12-18 US9437033B2 (en) | 2008-11-05 | 2014-07-23 | Generating 3D building models with ground level and orthogonal images |
Family Applications After (9)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/255,807 Active US10776999B2 (en) | 2008-11-05 | 2016-09-02 | Generating multi-dimensional building models with ground level images |
US16/544,327 Active US10643380B2 (en) | 2008-11-05 | 2019-08-19 | Generating multi-dimensional building models with ground level images |
US16/832,403 Active US10769847B2 (en) | 2008-11-05 | 2020-03-27 | Systems and methods for generating planar geometry |
US16/990,453 Active US11113877B2 (en) | 2008-11-05 | 2020-08-11 | Systems and methods for generating three dimensional geometry |
US17/396,255 Active US11741667B2 (en) | 2008-11-05 | 2021-08-06 | Systems and methods for generating three dimensional geometry |
US17/826,067 Active US11574441B2 (en) | 2008-11-05 | 2022-05-26 | Systems and methods for generating three dimensional geometry |
US17/826,085 Active US11574442B2 (en) | 2008-11-05 | 2022-05-26 | Systems and methods for generating three dimensional geometry |
US18/351,350 Pending US20230360326A1 (en) | 2008-11-05 | 2023-07-12 | Systems and methods for generating three dimensional geometry |
US18/351,378 Pending US20230377261A1 (en) | 2008-11-05 | 2023-07-12 | Systems and methods for generating three dimensional geometry |
Country Status (1)
Country | Link |
---|---|
US (15) | US8422825B1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180033198A1 (en) * | 2016-07-29 | 2018-02-01 | Microsoft Technology Licensing, Llc | Forward direction determination for augmented reality and virtual reality |
US20180182168A1 (en) * | 2015-09-02 | 2018-06-28 | Thomson Licensing | Method, apparatus and system for facilitating navigation in an extended scene |
US20180268614A1 (en) * | 2017-03-16 | 2018-09-20 | General Electric Company | Systems and methods for aligning pmi object on a model |
US10878138B2 (en) | 2017-02-23 | 2020-12-29 | Mitek Holdings, Inc. | Method of managing proxy objects |
US11514644B2 (en) | 2018-01-19 | 2022-11-29 | Enphase Energy, Inc. | Automated roof surface measurement from combined aerial LiDAR data and imagery |
Families Citing this family (141)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8422825B1 (en) * | 2008-11-05 | 2013-04-16 | Hover Inc. | Method and system for geometry extraction, 3D visualization and analysis using arbitrary oblique imagery |
US9536348B2 (en) * | 2009-06-18 | 2017-01-03 | Honeywell International Inc. | System and method for displaying video surveillance fields of view limitations |
US9104695B1 (en) | 2009-07-27 | 2015-08-11 | Palantir Technologies, Inc. | Geotagging structured data |
FR2954520B1 (en) * | 2009-12-18 | 2012-09-21 | Thales Sa | METHOD FOR THE DESIGNATION OF A TARGET FOR A TERMINAL IMAGING GUIDED ARMING |
US8963943B2 (en) * | 2009-12-18 | 2015-02-24 | Electronics And Telecommunications Research Institute | Three-dimensional urban modeling apparatus and method |
WO2011098274A1 (en) * | 2010-02-12 | 2011-08-18 | Marquardt Mechatronik Gmbh | Method for measuring a position |
JP5223062B2 (en) * | 2010-03-11 | 2013-06-26 | 株式会社ジオ技術研究所 | 3D map drawing system |
US8823732B2 (en) * | 2010-12-17 | 2014-09-02 | Pictometry International Corp. | Systems and methods for processing images with edge detection and snap-to feature |
JP5775354B2 (en) | 2011-04-28 | 2015-09-09 | 株式会社トプコン | Takeoff and landing target device and automatic takeoff and landing system |
JP5882693B2 (en) | 2011-11-24 | 2016-03-09 | 株式会社トプコン | Aerial photography imaging method and aerial photography imaging apparatus |
EP2527787B1 (en) * | 2011-05-23 | 2019-09-11 | Kabushiki Kaisha TOPCON | Aerial photograph image pickup method and aerial photograph image pickup apparatus |
US8878865B2 (en) * | 2011-09-21 | 2014-11-04 | Hover, Inc. | Three-dimensional map system |
GB201116438D0 (en) * | 2011-09-22 | 2011-11-02 | Advanced Risc Mach Ltd | Occlusion queries in graphics processing |
US9639757B2 (en) * | 2011-09-23 | 2017-05-02 | Corelogic Solutions, Llc | Building footprint extraction apparatus, method and computer program product |
US8630805B2 (en) * | 2011-10-20 | 2014-01-14 | Robert Bosch Gmbh | Methods and systems for creating maps with radar-optical imaging fusion |
US9002114B2 (en) | 2011-12-08 | 2015-04-07 | The Nielsen Company (Us), Llc | Methods, apparatus, and articles of manufacture to measure geographical features using an image of a geographical location |
KR101873525B1 (en) * | 2011-12-08 | 2018-07-03 | 삼성전자 주식회사 | Device and method for displaying a contents in wireless terminal |
US20130201339A1 (en) * | 2012-02-08 | 2013-08-08 | Honeywell International Inc. | System and method of optimal video camera placement and configuration |
US9378509B2 (en) | 2012-05-09 | 2016-06-28 | The Nielsen Company (Us), Llc | Methods, apparatus, and articles of manufacture to measure geographical features using an image of a geographical location |
US9404751B2 (en) * | 2012-06-06 | 2016-08-02 | Samsung Electronics Co., Ltd. | Apparatus and method for providing 3D map showing area of interest in real time |
DE112012006746T5 (en) * | 2012-07-30 | 2015-05-21 | Mitsubishi Electric Corporation | Map display device |
JP6122591B2 (en) | 2012-08-24 | 2017-04-26 | 株式会社トプコン | Photogrammetry camera and aerial photography equipment |
WO2014055953A2 (en) * | 2012-10-05 | 2014-04-10 | Eagle View Technologies, Inc. | Systems and methods for relating images to each other by determining transforms without using image acquisition metadata |
JP6055274B2 (en) | 2012-10-31 | 2016-12-27 | 株式会社トプコン | Aerial photograph measuring method and aerial photograph measuring system |
US9501507B1 (en) | 2012-12-27 | 2016-11-22 | Palantir Technologies Inc. | Geo-temporal indexing and searching |
US9159164B2 (en) | 2013-01-31 | 2015-10-13 | Eagle View Technologies, Inc. | Statistical point pattern matching technique |
US9147287B2 (en) | 2013-01-31 | 2015-09-29 | Eagle View Technologies, Inc. | Statistical point pattern matching technique |
US9380431B1 (en) | 2013-01-31 | 2016-06-28 | Palantir Technologies, Inc. | Use of teams in a mobile application |
US9082014B2 (en) | 2013-03-14 | 2015-07-14 | The Nielsen Company (Us), Llc | Methods and apparatus to estimate demography based on aerial images |
US8855999B1 (en) | 2013-03-15 | 2014-10-07 | Palantir Technologies Inc. | Method and system for generating a parser and parsing complex data |
US8903717B2 (en) | 2013-03-15 | 2014-12-02 | Palantir Technologies Inc. | Method and system for generating a parser and parsing complex data |
US8930897B2 (en) | 2013-03-15 | 2015-01-06 | Palantir Technologies Inc. | Data integration tool |
US8799799B1 (en) | 2013-05-07 | 2014-08-05 | Palantir Technologies Inc. | Interactive geospatial map |
US9041708B2 (en) * | 2013-07-23 | 2015-05-26 | Palantir Technologies, Inc. | Multiple viewshed analysis |
US20160203638A1 (en) * | 2013-08-26 | 2016-07-14 | Sculpteo | Method for displaying section views of a 3d model using a fragment shader |
US8938686B1 (en) | 2013-10-03 | 2015-01-20 | Palantir Technologies Inc. | Systems and methods for analyzing performance of an entity |
US8924872B1 (en) | 2013-10-18 | 2014-12-30 | Palantir Technologies Inc. | Overview user interface of emergency call data of a law enforcement agency |
US9021384B1 (en) | 2013-11-04 | 2015-04-28 | Palantir Technologies Inc. | Interactive vehicle information map |
US8868537B1 (en) | 2013-11-11 | 2014-10-21 | Palantir Technologies, Inc. | Simple web search |
CN103699900B (en) * | 2014-01-03 | 2016-10-05 | 西北工业大学 | Building horizontal vector profile automatic batch extracting method in satellite image |
CA3161755A1 (en) | 2014-01-10 | 2015-07-16 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
US11314905B2 (en) | 2014-02-11 | 2022-04-26 | Xactware Solutions, Inc. | System and method for generating computerized floor plans |
US9727376B1 (en) | 2014-03-04 | 2017-08-08 | Palantir Technologies, Inc. | Mobile tasks |
KR101429172B1 (en) * | 2014-04-11 | 2014-08-13 | 대한민국 | Method and device for determining position of object by using image acquired from camera, and computer-readable recording media using the same |
US9972121B2 (en) * | 2014-04-22 | 2018-05-15 | Google Llc | Selecting time-distributed panoramic images for display |
US9996636B2 (en) * | 2014-05-13 | 2018-06-12 | Atheer, Inc. | Method for forming walls to align 3D objects in 2D environment |
US9129219B1 (en) | 2014-06-30 | 2015-09-08 | Palantir Technologies, Inc. | Crime risk forecasting |
US10412594B2 (en) | 2014-07-31 | 2019-09-10 | At&T Intellectual Property I, L.P. | Network planning tool support for 3D data |
US11182712B2 (en) | 2014-09-26 | 2021-11-23 | The Sherwin-Williams Company | System and method for determining coating requirements |
CN104318513A (en) * | 2014-09-29 | 2015-01-28 | 陈奕 | Building three-dimensional image display platform and application system thereof |
CN104504748B (en) * | 2014-12-03 | 2017-09-19 | 中国科学院遥感与数字地球研究所 | A kind of infrared 3-D imaging system of unmanned plane oblique photograph and modeling method |
CN104463970B (en) * | 2014-12-24 | 2017-05-24 | 中国科学院地理科学与资源研究所 | Method for determining three-dimensional gravity center of city based on remote-sensing image and application thereof |
US10372879B2 (en) | 2014-12-31 | 2019-08-06 | Palantir Technologies Inc. | Medical claims lead summary report generation |
US9811889B2 (en) * | 2014-12-31 | 2017-11-07 | Nokia Technologies Oy | Method, apparatus and computer program product for generating unobstructed object views |
CN104634324B (en) * | 2015-02-06 | 2017-09-05 | 北京林业大学 | A kind of arbitrarily photograph single-phase coordinates the technical method of the emergent regional space state of DTM positioning |
EP3845426A1 (en) | 2015-02-10 | 2021-07-07 | Mobileye Vision Technologies Ltd. | Sparse map for autonomous vehicle navigation |
US9891808B2 (en) | 2015-03-16 | 2018-02-13 | Palantir Technologies Inc. | Interactive user interfaces for location-based data analysis |
EP3271900A4 (en) | 2015-03-17 | 2019-03-06 | Environmental Systems Research Institute, Inc. | Interactive dimensioning of parametric models |
US10878278B1 (en) * | 2015-05-16 | 2020-12-29 | Sturfee, Inc. | Geo-localization based on remotely sensed visual features |
US9460175B1 (en) | 2015-06-03 | 2016-10-04 | Palantir Technologies Inc. | Server implemented geographic information system with graphical interface |
US9600146B2 (en) | 2015-08-17 | 2017-03-21 | Palantir Technologies Inc. | Interactive geospatial map |
US10706434B1 (en) | 2015-09-01 | 2020-07-07 | Palantir Technologies Inc. | Methods and systems for determining location information |
US9639580B1 (en) | 2015-09-04 | 2017-05-02 | Palantir Technologies, Inc. | Computer-implemented systems and methods for data management and visualization |
CN105184859B (en) * | 2015-09-22 | 2018-09-07 | 国网上海市电力公司 | A kind of substation's three-dimensional modeling method based on laser scanning |
US10885097B2 (en) | 2015-09-25 | 2021-01-05 | The Nielsen Company (Us), Llc | Methods and apparatus to profile geographic areas of interest |
JP6567940B2 (en) * | 2015-10-05 | 2019-08-28 | 株式会社小松製作所 | Construction management system |
DE102015120927A1 (en) * | 2015-12-02 | 2017-06-08 | Krauss-Maffei Wegmann Gmbh & Co. Kg | Method for displaying a simulation environment |
WO2017100658A1 (en) * | 2015-12-09 | 2017-06-15 | Xactware Solutions, Inc. | System and method for generating computerized models of structures using geometry extraction and reconstruction techniques |
US10430961B2 (en) | 2015-12-16 | 2019-10-01 | Objectvideo Labs, Llc | Using satellite imagery to enhance a 3D surface model of a real world cityscape |
US10109094B2 (en) | 2015-12-21 | 2018-10-23 | Palantir Technologies Inc. | Interface to index and display geospatial data |
CA3012049A1 (en) | 2016-01-20 | 2017-07-27 | Ez3D, Llc | System and method for structural inspection and construction estimation using an unmanned aerial vehicle |
US10867328B2 (en) * | 2016-05-03 | 2020-12-15 | Yembo, Inc. | Systems and methods for providing AI-based cost estimates for services |
US10068199B1 (en) | 2016-05-13 | 2018-09-04 | Palantir Technologies Inc. | System to catalogue tracking data |
CN109643125B (en) * | 2016-06-28 | 2022-11-15 | 柯尼亚塔有限公司 | Realistic 3D virtual world creation and simulation for training an autonomous driving system |
US10365658B2 (en) | 2016-07-21 | 2019-07-30 | Mobileye Vision Technologies Ltd. | Systems and methods for aligning crowdsourced sparse map data |
US9686357B1 (en) | 2016-08-02 | 2017-06-20 | Palantir Technologies Inc. | Mapping content delivery |
US10437840B1 (en) | 2016-08-19 | 2019-10-08 | Palantir Technologies Inc. | Focused probabilistic entity resolution from multiple data sources |
US10650621B1 (en) | 2016-09-13 | 2020-05-12 | Iocurrents, Inc. | Interfacing with a vehicular controller area network |
US10127670B2 (en) | 2016-09-27 | 2018-11-13 | Xactware Solutions, Inc. | Computer vision systems and methods for detecting and modeling features of structures in images |
CN106600675A (en) * | 2016-12-07 | 2017-04-26 | 西安蒜泥电子科技有限责任公司 | Point cloud synthesis method based on constraint of depth map |
US10515433B1 (en) | 2016-12-13 | 2019-12-24 | Palantir Technologies Inc. | Zoom-adaptive data granularity to achieve a flexible high-performance interface for a geospatial mapping system |
US10270727B2 (en) | 2016-12-20 | 2019-04-23 | Palantir Technologies, Inc. | Short message communication within a mobile graphical map |
US10460602B1 (en) | 2016-12-28 | 2019-10-29 | Palantir Technologies Inc. | Interactive vehicle information mapping system |
US10928202B2 (en) | 2016-12-30 | 2021-02-23 | Geo-Comm Inc. | System and methods for three-dimensional volumetric indoor location geocoding |
JP6470323B2 (en) * | 2017-01-23 | 2019-02-13 | ファナック株式会社 | Information display system |
CN108537891A (en) * | 2017-03-01 | 2018-09-14 | 黎志毅 | The method that three-dimensional material and textures data are automatically switched to UE4 |
US10579239B1 (en) | 2017-03-23 | 2020-03-03 | Palantir Technologies Inc. | Systems and methods for production and display of dynamically linked slide presentations |
US11046430B1 (en) * | 2017-04-17 | 2021-06-29 | United States Of America As Represented By The Administrator Of Nasa | Intelligent trajectory adviser system for unmanned aerial vehicles in complex environments |
EP3580690B1 (en) | 2017-05-24 | 2020-09-02 | Google LLC | Bayesian methodology for geospatial object/characteristic detection |
US10895946B2 (en) | 2017-05-30 | 2021-01-19 | Palantir Technologies Inc. | Systems and methods for using tiled data |
US11334216B2 (en) | 2017-05-30 | 2022-05-17 | Palantir Technologies Inc. | Systems and methods for visually presenting geospatial information |
US20180349862A1 (en) * | 2017-06-04 | 2018-12-06 | Roof Right Now, LLC | Automated Estimate Systems and Methods |
US10297074B2 (en) | 2017-07-18 | 2019-05-21 | Fuscoe Engineering, Inc. | Three-dimensional modeling from optical capture |
US10403011B1 (en) | 2017-07-18 | 2019-09-03 | Palantir Technologies Inc. | Passing system with an interactive user interface |
US10460465B2 (en) | 2017-08-31 | 2019-10-29 | Hover Inc. | Method for generating roof outlines from lateral images |
US10949451B2 (en) * | 2017-09-01 | 2021-03-16 | Jonathan Giuffrida | System and method for managing and retrieving disparate geographically coded data in a database |
WO2019094939A1 (en) | 2017-11-13 | 2019-05-16 | Geomni, Inc. | Systems and methods for rapidly developing annotated computer models of structures |
US11145116B2 (en) | 2017-11-21 | 2021-10-12 | Faro Technologies, Inc. | System and method of scanning an environment and generating two dimensional images of the environment |
US10371537B1 (en) | 2017-11-29 | 2019-08-06 | Palantir Technologies Inc. | Systems and methods for flexible route planning |
US11599706B1 (en) | 2017-12-06 | 2023-03-07 | Palantir Technologies Inc. | Systems and methods for providing a view of geospatial information |
US10698756B1 (en) | 2017-12-15 | 2020-06-30 | Palantir Technologies Inc. | Linking related events for various devices and services in computer log files on a centralized server |
US10616412B1 (en) | 2017-12-20 | 2020-04-07 | Geo-Comm, Inc. | Processing an emergency services call |
US10733470B2 (en) * | 2018-01-25 | 2020-08-04 | Geomni, Inc. | Systems and methods for rapid alignment of digital imagery datasets to models of structures |
US10546419B2 (en) * | 2018-02-14 | 2020-01-28 | Faro Technologies, Inc. | System and method of on-site documentation enhancement through augmented reality |
US10896234B2 (en) | 2018-03-29 | 2021-01-19 | Palantir Technologies Inc. | Interactive geographical map |
US10830599B2 (en) | 2018-04-03 | 2020-11-10 | Palantir Technologies Inc. | Systems and methods for alternative projections of geographical information |
US11585672B1 (en) | 2018-04-11 | 2023-02-21 | Palantir Technologies Inc. | Three-dimensional representations of routes |
CN108665536B (en) * | 2018-05-14 | 2021-07-09 | 广州市城市规划勘测设计研究院 | Three-dimensional and live-action data visualization method and device and computer readable storage medium |
US10429197B1 (en) | 2018-05-29 | 2019-10-01 | Palantir Technologies Inc. | Terrain analysis for automatic route determination |
CN108921045B (en) * | 2018-06-11 | 2021-08-03 | 佛山科学技术学院 | Spatial feature extraction and matching method and device of three-dimensional model |
EP3807844A4 (en) | 2018-06-15 | 2022-04-27 | Geomni, Inc. | Computer vision systems and methods for modeling roofs of structures using two-dimensional and partial three-dimensional data |
US11257297B1 (en) * | 2018-06-15 | 2022-02-22 | Baru Inc. | System, method, and computer program product for manufacturing a customized product |
US11017548B2 (en) * | 2018-06-21 | 2021-05-25 | Hand Held Products, Inc. | Methods, systems, and apparatuses for computing dimensions of an object using range images |
CN109284520B (en) * | 2018-07-10 | 2022-11-01 | 广东工业大学 | DWG (discrete wavelet transform) architectural drawing outer wall rapid extraction method |
US11182954B2 (en) * | 2018-09-07 | 2021-11-23 | Hivemapper Inc. | Generating three-dimensional geo-registered maps from image data |
US10467435B1 (en) | 2018-10-24 | 2019-11-05 | Palantir Technologies Inc. | Approaches for managing restrictions for middleware applications |
US11025672B2 (en) | 2018-10-25 | 2021-06-01 | Palantir Technologies Inc. | Approaches for securing middleware data access |
US10955256B2 (en) * | 2018-10-26 | 2021-03-23 | Here Global B.V. | Mapping system and method for applying texture to visual representations of buildings |
CA3118897A1 (en) | 2018-11-07 | 2020-05-14 | Andrey Konchenko | Data structures for augmented reality planning of geographic locations |
CN109544683B (en) * | 2018-11-07 | 2023-01-10 | 北京科技大学 | Urban building group seismic response dynamic visualization method based on oblique photography data |
CN112041892A (en) * | 2019-04-03 | 2020-12-04 | 南京泊路吉科技有限公司 | Panoramic image-based ortho image generation method |
CN110211245A (en) * | 2019-06-13 | 2019-09-06 | 威创集团股份有限公司 | Show the method and device for the personnel that deploy to ensure effective monitoring and control of illegal activities |
US20220259823A1 (en) * | 2019-06-18 | 2022-08-18 | Nec Corporation | Excavation system, work system, control device, control method, and non-transitory computer-readable medium storing a program |
CN110276757A (en) * | 2019-06-25 | 2019-09-24 | 北京林业大学 | One kind carrying out high canopy density artificial forest region single tree biomass draughtsmanship based on tilted photograph |
CN110260876A (en) * | 2019-06-27 | 2019-09-20 | 厦门建研建筑产业研究有限公司 | A kind of road model generation method and system based on oblique photograph and GIS technology |
JP7156542B2 (en) * | 2019-08-19 | 2022-10-19 | 日本電信電話株式会社 | DETECTION DEVICE, DETECTION METHOD, AND DETECTION PROGRAM FOR LINEAR STRUCTURE |
WO2021096843A1 (en) | 2019-11-11 | 2021-05-20 | Hover Inc. | Systems and methods for selective image compositing |
CN110992469B (en) * | 2019-11-29 | 2024-01-23 | 四川航天神坤科技有限公司 | Visualization method and system for massive three-dimensional model data |
CN111028152B (en) * | 2019-12-02 | 2023-05-05 | 哈尔滨工程大学 | Super-resolution reconstruction method of sonar image based on terrain matching |
CN111222586B (en) * | 2020-04-20 | 2020-09-18 | 广州都市圈网络科技有限公司 | Inclined image matching method and device based on three-dimensional inclined model visual angle |
WO2022011191A1 (en) * | 2020-07-09 | 2022-01-13 | Tektronix, Inc. | Indicating a probing target for a fabricated electronic circuit |
US11847739B2 (en) | 2020-08-26 | 2023-12-19 | Hover Inc. | Systems and methods for pitch determination |
CN112132969B (en) * | 2020-09-01 | 2023-10-10 | 济南市房产测绘研究院(济南市房屋安全检测鉴定中心) | Vehicle-mounted laser point cloud building target classification method |
US11727635B2 (en) * | 2020-10-22 | 2023-08-15 | Faro Technologies, Inc. | Hybrid photogrammetry |
CN112200906B (en) * | 2020-10-23 | 2021-05-07 | 江苏省测绘研究所 | Entity extraction method and system for inclined three-dimensional model |
US11094135B1 (en) | 2021-03-05 | 2021-08-17 | Flyreel, Inc. | Automated measurement of interior spaces through guided modeling of dimensions |
AU2022246173A1 (en) | 2021-03-25 | 2023-09-28 | Insurance Services Office, Inc. | Computer vision systems and methods for generating building models using three-dimensional sensing and augmented reality techniques |
CN113256813B (en) * | 2021-07-01 | 2021-09-17 | 西南石油大学 | Constrained building facade orthophoto map extraction method |
CA3221270A1 (en) * | 2021-07-08 | 2023-01-12 | Matthew Thomas | Methods, storage media, and systems for augmenting data or models |
KR20230135431A (en) * | 2022-03-16 | 2023-09-25 | 에스에이피엔디에이 주식회사 | Automatic Design Method of Space Layout for Multiple Objects, and Medium Being Recorded with Program for Executing the Method |
WO2024044346A1 (en) * | 2022-08-25 | 2024-02-29 | Insurance Services Office, Inc. | Computer vision systems and methods for generating building models using three-dimensional sensing and augmented reality techniques |
Family Cites Families (79)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4798028A (en) | 1987-11-30 | 1989-01-17 | Pinion John A | Downspout trap and clean out |
JPH07294215A (en) * | 1994-04-25 | 1995-11-10 | Canon Inc | Method and apparatus for processing image |
JP3869876B2 (en) * | 1995-12-19 | 2007-01-17 | キヤノン株式会社 | Image measuring method and image measuring apparatus |
US5973697A (en) | 1997-01-27 | 1999-10-26 | International Business Machines Corporation | Method and system for providing preferred face views of objects in a three-dimensional (3D) environment in a display in a computer system |
US6597818B2 (en) * | 1997-05-09 | 2003-07-22 | Sarnoff Corporation | Method and apparatus for performing geo-spatial registration of imagery |
US6587601B1 (en) * | 1999-06-29 | 2003-07-01 | Sarnoff Corporation | Method and apparatus for performing geo-spatial registration using a Euclidean representation |
US6980690B1 (en) * | 2000-01-20 | 2005-12-27 | Canon Kabushiki Kaisha | Image processing apparatus |
US7148898B1 (en) | 2000-03-29 | 2006-12-12 | Sourceprose Corporation | System and method for synchronizing raster and vector map images |
JP4094219B2 (en) * | 2000-09-19 | 2008-06-04 | アルパイン株式会社 | 3D map display method for in-vehicle navigation system |
US8130242B2 (en) * | 2000-11-06 | 2012-03-06 | Nant Holdings Ip, Llc | Interactivity via mobile image recognition |
JP3823724B2 (en) | 2000-12-14 | 2006-09-20 | 日本電気株式会社 | Three-dimensional aerial sightseeing improvement server, method thereof, and recording medium |
US7509241B2 (en) * | 2001-07-06 | 2009-03-24 | Sarnoff Corporation | Method and apparatus for automatically generating a site model |
US7194148B2 (en) * | 2001-09-07 | 2007-03-20 | Yavitz Edward Q | Technique for providing simulated vision |
GB2383245B (en) * | 2001-11-05 | 2005-05-18 | Canon Europa Nv | Image processing apparatus |
DE10211293A1 (en) * | 2002-03-14 | 2003-09-25 | Basf Ag | Process for automated surface control and surface correction |
US7199793B2 (en) * | 2002-05-21 | 2007-04-03 | Mok3, Inc. | Image-based modeling and photo editing |
JP4147059B2 (en) * | 2002-07-03 | 2008-09-10 | 株式会社トプコン | Calibration data measuring device, measuring method and measuring program, computer-readable recording medium, and image data processing device |
WO2004042662A1 (en) * | 2002-10-15 | 2004-05-21 | University Of Southern California | Augmented virtual environments |
US20040196282A1 (en) | 2003-02-14 | 2004-10-07 | Oh Byong Mok | Modeling and editing image panoramas |
JP3944738B2 (en) * | 2003-03-18 | 2007-07-18 | ソニー株式会社 | Image processing apparatus and method, recording medium, and program |
US7814436B2 (en) | 2003-07-28 | 2010-10-12 | Autodesk, Inc. | 3D scene orientation indicator system with scene orientation change capability |
US20050267657A1 (en) * | 2004-05-04 | 2005-12-01 | Devdhar Prashant P | Method for vehicle classification |
US7813902B2 (en) | 2004-07-30 | 2010-10-12 | Dean Onchuck | Dormer calculator |
US7728833B2 (en) * | 2004-08-18 | 2010-06-01 | Sarnoff Corporation | Method for generating a three-dimensional model of a roof structure |
EP1855263B1 (en) | 2005-03-02 | 2012-08-15 | Navitime Japan Co., Ltd. | Map display device |
WO2007027847A2 (en) * | 2005-09-01 | 2007-03-08 | Geosim Systems Ltd. | System and method for cost-effective, high-fidelity 3d-modeling of large-scale urban environments |
US8098899B2 (en) | 2005-11-14 | 2012-01-17 | Fujifilm Corporation | Landmark search system for digital camera, map data, and method of sorting image data |
US8160400B2 (en) * | 2005-11-17 | 2012-04-17 | Microsoft Corporation | Navigating images using image based geometric alignment and object based controls |
US8942483B2 (en) | 2009-09-14 | 2015-01-27 | Trimble Navigation Limited | Image-based georeferencing |
US7528938B2 (en) * | 2006-01-10 | 2009-05-05 | Harris Corporation | Geospatial image change detecting system and associated methods |
WO2007087485A2 (en) | 2006-01-13 | 2007-08-02 | Digicontractor Corporation | Method and apparatus for photographic measurement |
US7778491B2 (en) | 2006-04-10 | 2010-08-17 | Microsoft Corporation | Oblique image stitching |
DE102006028086A1 (en) | 2006-06-19 | 2007-12-20 | Jochen Hummel | Method of creating a three-dimensional computer model of a city |
US20080089577A1 (en) * | 2006-10-11 | 2008-04-17 | Younian Wang | Feature extraction from stereo imagery |
US20080112610A1 (en) * | 2006-11-14 | 2008-05-15 | S2, Inc. | System and method for 3d model generation |
US8538166B2 (en) | 2006-11-21 | 2013-09-17 | Mantisvision Ltd. | 3D geometric modeling and 3D video content creation |
US8253731B2 (en) * | 2006-11-27 | 2012-08-28 | Designin Corporation | Systems, methods, and computer program products for home and landscape design |
US8078436B2 (en) * | 2007-04-17 | 2011-12-13 | Eagle View Technologies, Inc. | Aerial roof estimation systems and methods |
US20090000031A1 (en) | 2007-06-29 | 2009-01-01 | Steve Feher | Multiple convective cushion seating and sleeping systems and methods |
US7978937B2 (en) * | 2007-07-02 | 2011-07-12 | International Business Machines Corporation | Using photographic images as a search attribute |
EP2183724B1 (en) * | 2007-07-27 | 2019-06-26 | Esri R&D Center Zurich AG | Computer system and method for generating a 3d geometric model |
EP2179600B1 (en) | 2007-08-06 | 2015-07-01 | TRX Systems, Inc. | Locating, tracking, and/or monitoring people and/or assets both indoors and outdoors |
EP2058765A1 (en) * | 2007-11-07 | 2009-05-13 | 3D Geo GmbH | Method and device for texturizing an object of a virtual three-dimensional geometrical model |
US8531472B2 (en) * | 2007-12-03 | 2013-09-10 | Pictometry International Corp. | Systems and methods for rapid three-dimensional modeling with real façade texture |
US8531449B2 (en) * | 2007-12-18 | 2013-09-10 | Navteq B.V. | System and method for producing multi-angle views of an object-of-interest from images in an image dataset |
US8350850B2 (en) | 2008-03-31 | 2013-01-08 | Microsoft Corporation | Using photo collections for three dimensional modeling |
US8224097B2 (en) * | 2008-06-12 | 2012-07-17 | Sri International | Building segmentation for densely built urban regions using aerial LIDAR data |
EP2157545A1 (en) | 2008-08-19 | 2010-02-24 | Sony Computer Entertainment Europe Limited | Entertainment device, system and method |
US8170840B2 (en) | 2008-10-31 | 2012-05-01 | Eagle View Technologies, Inc. | Pitch determination systems and methods for aerial roof estimation |
US8209152B2 (en) | 2008-10-31 | 2012-06-26 | Eagleview Technologies, Inc. | Concurrent display systems and methods for aerial roof estimation |
US8422825B1 (en) * | 2008-11-05 | 2013-04-16 | Hover Inc. | Method and system for geometry extraction, 3D visualization and analysis using arbitrary oblique imagery |
US8139111B2 (en) | 2008-12-04 | 2012-03-20 | The Boeing Company | Height measurement in a perspective image |
US9098926B2 (en) * | 2009-02-06 | 2015-08-04 | The Hong Kong University Of Science And Technology | Generating three-dimensional façade models from images |
US8390617B1 (en) | 2009-06-04 | 2013-03-05 | Google Inc. | Visualizing oblique images |
US8473852B2 (en) | 2009-07-31 | 2013-06-25 | Siemens Corporation | Virtual world building operations center |
CA2686991A1 (en) * | 2009-12-03 | 2011-06-03 | Ibm Canada Limited - Ibm Canada Limitee | Rescaling an avatar for interoperability in 3d virtual world environments |
WO2011079241A1 (en) | 2009-12-23 | 2011-06-30 | Tomtom International Bv | Method of generating building facade data for a geospatial database for a mobile device |
US8345930B2 (en) * | 2010-01-22 | 2013-01-01 | Sri International | Method for computing food volume in a method for analyzing food |
US9129432B2 (en) * | 2010-01-28 | 2015-09-08 | The Hong Kong University Of Science And Technology | Image-based procedural remodeling of buildings |
WO2011091552A1 (en) | 2010-02-01 | 2011-08-04 | Intel Corporation | Extracting and mapping three dimensional features from geo-referenced images |
WO2011153624A2 (en) * | 2010-06-11 | 2011-12-15 | Ambercore Software Inc. | System and method for manipulating data having spatial coordinates |
US8466915B1 (en) * | 2010-06-15 | 2013-06-18 | Google Inc. | Fusion of ground-based facade models with 3D building models |
WO2012037157A2 (en) * | 2010-09-13 | 2012-03-22 | Alt Software (Us) Llc | System and method for displaying data having spatial coordinates |
US9727834B2 (en) | 2011-06-08 | 2017-08-08 | Jerome Reyes | Remote measurement via on-site portable platform |
US8890863B1 (en) * | 2011-08-12 | 2014-11-18 | Google Inc. | Automatic method for photo texturing geolocated 3-D models from geolocated imagery |
JP2015507860A (en) | 2011-12-07 | 2015-03-12 | インテル コーポレイション | Guide to image capture |
KR101873525B1 (en) * | 2011-12-08 | 2018-07-03 | 삼성전자 주식회사 | Device and method for displaying a contents in wireless terminal |
US9118905B2 (en) * | 2011-12-30 | 2015-08-25 | Google Inc. | Multiplane panoramas of long scenes |
US9933257B2 (en) | 2012-02-03 | 2018-04-03 | Eagle View Technologies, Inc. | Systems and methods for estimation of building wall area |
US9153061B2 (en) * | 2012-05-04 | 2015-10-06 | Qualcomm Incorporated | Segmentation of 3D point clouds for dense 3D modeling |
WO2014055953A2 (en) * | 2012-10-05 | 2014-04-10 | Eagle View Technologies, Inc. | Systems and methods for relating images to each other by determining transforms without using image acquisition metadata |
US9159164B2 (en) | 2013-01-31 | 2015-10-13 | Eagle View Technologies, Inc. | Statistical point pattern matching technique |
US9530225B1 (en) * | 2013-03-11 | 2016-12-27 | Exelis, Inc. | Point cloud data processing for scalable compression |
US9292747B2 (en) * | 2013-03-15 | 2016-03-22 | The Boeing Company | Methods and systems for automatic and semi-automatic geometric and geographic feature extraction |
US8872818B2 (en) * | 2013-03-15 | 2014-10-28 | State Farm Mutual Automobile Insurance Company | Methods and systems for capturing the condition of a physical structure |
WO2015116962A1 (en) | 2014-01-31 | 2015-08-06 | Hover Inc. | Scale error correction in a geo-referenced multi-dimensional model |
US9436987B2 (en) | 2014-04-30 | 2016-09-06 | Seiko Epson Corporation | Geodesic distance based primitive segmentation and fitting for 3D modeling of non-rigid objects from 2D images |
US9934608B2 (en) | 2015-05-29 | 2018-04-03 | Hover Inc. | Graphical overlay guide for interface |
DE102016215840A1 (en) * | 2016-08-23 | 2018-03-01 | Volkswagen Aktiengesellschaft | Method for detecting curbs in the vehicle environment |
2008
- 2008-11-05 US US12/265,656 patent/US8422825B1/en active Active

2013
- 2013-04-08 US US13/858,707 patent/US8649632B2/en active Active

2014
- 2014-01-27 US US14/164,508 patent/US9430871B2/en active Active
- 2014-01-30 US US14/168,149 patent/US20140320485A1/en not_active Abandoned
- 2014-07-23 US US14/339,127 patent/US9437033B2/en active Active

2016
- 2016-09-02 US US15/255,952 patent/US20160371882A1/en not_active Abandoned
- 2016-09-02 US US15/255,807 patent/US10776999B2/en active Active

2019
- 2019-08-19 US US16/544,327 patent/US10643380B2/en active Active

2020
- 2020-03-27 US US16/832,403 patent/US10769847B2/en active Active
- 2020-08-11 US US16/990,453 patent/US11113877B2/en active Active

2021
- 2021-08-06 US US17/396,255 patent/US11741667B2/en active Active

2022
- 2022-05-26 US US17/826,067 patent/US11574441B2/en active Active
- 2022-05-26 US US17/826,085 patent/US11574442B2/en active Active

2023
- 2023-07-12 US US18/351,350 patent/US20230360326A1/en active Pending
- 2023-07-12 US US18/351,378 patent/US20230377261A1/en active Pending
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180182168A1 (en) * | 2015-09-02 | 2018-06-28 | Thomson Licensing | Method, apparatus and system for facilitating navigation in an extended scene |
US11699266B2 (en) * | 2015-09-02 | 2023-07-11 | Interdigital Ce Patent Holdings, Sas | Method, apparatus and system for facilitating navigation in an extended scene |
US20180033198A1 (en) * | 2016-07-29 | 2018-02-01 | Microsoft Technology Licensing, Llc | Forward direction determination for augmented reality and virtual reality |
US10878138B2 (en) | 2017-02-23 | 2020-12-29 | Mitek Holdings, Inc. | Method of managing proxy objects |
US11314903B2 (en) | 2017-02-23 | 2022-04-26 | Mitek Holdings, Inc. | Method of managing proxy objects |
US11687684B2 (en) | 2017-02-23 | 2023-06-27 | Mitek Holdings, Inc. | Method of managing proxy objects |
US20180268614A1 (en) * | 2017-03-16 | 2018-09-20 | General Electric Company | Systems and methods for aligning pmi object on a model |
US11514644B2 (en) | 2018-01-19 | 2022-11-29 | Enphase Energy, Inc. | Automated roof surface measurement from combined aerial LiDAR data and imagery |
Also Published As
Publication number | Publication date |
---|---|
US11113877B2 (en) | 2021-09-07 |
US20230360326A1 (en) | 2023-11-09 |
US9430871B2 (en) | 2016-08-30 |
US20220284674A1 (en) | 2022-09-08 |
US20200372708A1 (en) | 2020-11-26 |
US20140139523A1 (en) | 2014-05-22 |
US10643380B2 (en) | 2020-05-05 |
US10769847B2 (en) | 2020-09-08 |
US20130222375A1 (en) | 2013-08-29 |
US8422825B1 (en) | 2013-04-16 |
US20200226826A1 (en) | 2020-07-16 |
US20160371875A1 (en) | 2016-12-22 |
US10776999B2 (en) | 2020-09-15 |
US20140320485A1 (en) | 2014-10-30 |
US20190378328A1 (en) | 2019-12-12 |
US8649632B2 (en) | 2014-02-11 |
US20220284673A1 (en) | 2022-09-08 |
US20230377261A1 (en) | 2023-11-23 |
US20150029182A1 (en) | 2015-01-29 |
US9437033B2 (en) | 2016-09-06 |
US11741667B2 (en) | 2023-08-29 |
US11574441B2 (en) | 2023-02-07 |
US11574442B2 (en) | 2023-02-07 |
US20210366187A1 (en) | 2021-11-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20160371882A1 (en) | Method and system for displaying and navigating an optimal multi-dimensional building model | |
US11783543B2 (en) | Method and system for displaying and navigating an optimal multi-dimensional building model | |
US9437044B2 (en) | Method and system for displaying and navigating building facades in a three-dimensional mapping system | |
US10030990B2 (en) | Alternate viewpoint image enhancement | |
US9256983B2 (en) | On demand image overlay | |
US9330504B2 (en) | 3D building model construction tools | |
KR101319805B1 (en) | Photographing big things | |
US9047688B2 (en) | Depth cursor and depth measurement in images | |
CN104183016B (en) | A kind of construction method of quick 2.5 dimension building model | |
US8903645B2 (en) | System and apparatus for processing information, image display apparatus, control method and computer program | |
US10878599B2 (en) | Soft-occlusion for computer graphics rendering | |
CA3177646A1 (en) | Spatial processing for map geometry simplification | |
CN112105892A (en) | Identifying map features using motion data and bin data | |
JP7420815B2 (en) | System and method for selecting complementary images from a plurality of images for 3D geometric extraction | |
US10275939B2 (en) | Determining two-dimensional images using three-dimensional models |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | AS | Assignment | Owner name: HOVER INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:EGE, CHRISTOPHER DAVID;PAVLIDIS, IOANNIS;ALTMAN, ADAM J.;SIGNING DATES FROM 20140808 TO 20140825;REEL/FRAME:043088/0798 |
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |