WO2011079241A1 - Method of generating building facade data for a geospatial database for a mobile device - Google Patents

Method of generating building facade data for a geospatial database for a mobile device

Info

Publication number
WO2011079241A1
WO2011079241A1 (PCT application no. PCT/US2010/061957)
Authority
WO
WIPO (PCT)
Prior art keywords
façade
building
image
category
attribute
Prior art date
Application number
PCT/US2010/061957
Other languages
French (fr)
Inventor
Boris Menkov
Paul De Velder
Tim Wolff
Silvan Stok
Simone Tertoolen
Original Assignee
Tomtom International Bv
Tomtom Belgium Nv
Tomtom North America, Inc
Priority date
Filing date
Publication date
Application filed by Tomtom International Bv, Tomtom Belgium Nv, Tomtom North America, Inc filed Critical Tomtom International Bv
Publication of WO2011079241A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models

Definitions

  • the present invention relates to a method of generating façade data for a geospatial database, the method being of the type that, for example, accesses image data and associates a façade image therewith.
  • the present invention also relates to a building façade information generation system of the type that, for example, accesses image data and associates a façade image therewith.
  • the present invention further relates to a mobile computing apparatus of the type that, for example, provides a three dimensional representation of a building.
  • the present invention also relates to a method of rendering a façade of a building, the method being of the type that, for example, represents the building in three dimensions.
  • Portable computing devices for example Portable Navigation Devices (PNDs) that include GPS (Global Positioning System) signal reception and processing functionality are well known and are widely employed as in-car or other vehicle navigation systems.
  • a modern PND comprises a processor, memory and map data stored within said memory.
  • the processor and memory cooperate to provide an execution environment in which a software operating system is typically established, and additionally it is commonplace for one or more additional software programs to be provided to enable the functionality of the PND to be controlled, and to provide various other functions.
  • these devices further comprise one or more input interfaces that allow a user to interact with and control the device, and one or more output interfaces by means of which information may be relayed to the user.
  • output interfaces include a visual display and a speaker for audible output.
  • input interfaces include one or more physical buttons to control on/off operation or other features of the device (which buttons need not necessarily be on the device itself but can be on a steering wheel if the device is built into a vehicle), and a microphone for detecting user speech.
  • the output interface display may be configured as a touch sensitive display (by means of a touch sensitive overlay or otherwise) additionally to provide an input interface by means of which a user can operate the device by touch.
  • Devices of this type will also often include one or more physical connector interfaces by means of which power and optionally data signals can be transmitted to and received from the device, and optionally one or more wireless transmitters/receivers to allow communication over cellular telecommunications and other signal and data networks, for example Bluetooth, Wi-Fi, Wi-Max, GSM, UMTS and the like.
  • PNDs of this type also include a GPS antenna by means of which satellite-broadcast signals, including location data, can be received and subsequently processed to determine a current location of the device.
  • the PND may also include electronic gyroscopes and accelerometers which produce signals that can be processed to determine the current angular and linear acceleration, and in turn, and in conjunction with location information derived from the GPS signal, velocity and relative displacement of the device and thus the vehicle in which it is mounted.
  • The utility of such PNDs is manifested primarily in their ability to determine a route between a first location (typically a start or current location) and a second location (typically a destination). These locations can be input by a user of the device, by any of a wide variety of different methods, for example by postcode, street name and house number, previously stored "well known" destinations (such as famous locations, municipal locations (such as sports grounds or swimming baths) or other points of interest), and favourite or recently visited destinations.
  • the PND is enabled by software for computing a “best” or “optimum” route between the start and destination address locations from the map data.
  • a “best” or “optimum” route is determined on the basis of predetermined criteria and need not necessarily be the fastest or shortest route.
  • the selection of the route along which to guide the driver can be very sophisticated, and the selected route may take into account existing, predicted and dynamically and/or wirelessly received traffic and road information, historical information about road speeds, and the driver's own preferences for the factors determining road choice (for example the driver may specify that the route should not include motorways or toll roads).
  • the device may continually monitor road and traffic conditions, and offer to or choose to change the route over which the remainder of the journey is to be made due to changed conditions.
  • Real time traffic monitoring systems based on various technologies (e.g. mobile phone data exchanges, fixed cameras, GPS fleet tracking), are being used to identify traffic delays and to feed the information into notification systems.
  • PNDs of this type may typically be mounted on the dashboard or windscreen of a vehicle, but may also be formed as part of an on-board computer of the vehicle radio or indeed as part of the control system of the vehicle itself.
  • the navigation device may also be part of a hand-held system, such as a PDA (Portable Digital Assistant), a media player, a mobile telephone or the like, and in these cases, the normal functionality of the hand-held system is extended by means of the installation of software on the device to perform both route calculation and navigation along a calculated route.
  • the user interacts with the navigation device to select the desired calculated route, optionally from a list of proposed routes.
  • the user may intervene in, or guide the route selection process, for example by specifying that certain routes, roads, locations or criteria are to be avoided or are mandatory for a particular journey.
  • the route calculation aspect of the PND forms one primary function, and navigation along such a route is another primary function.
  • a further important function provided by the device is automatic route re-calculation in the event that: a user deviates from the previously calculated route during navigation (either by accident or intentionally); real-time traffic conditions dictate that an alternative route would be more expedient and the device is suitably enabled to recognize such conditions automatically, or if a user actively causes the device to perform route re-calculation for any reason.
  • a route to be calculated with user defined criteria for example, the user may prefer a scenic route to be calculated by the device, or may wish to avoid any roads on which traffic congestion is likely, expected or currently prevailing.
  • the device software would then calculate various routes and weigh more favourably those that include along their route the highest number of points of interest (known as POIs), which are examples of geographic features, tagged as being for example of scenic beauty, or, using stored information indicative of prevailing traffic conditions on particular roads, order the calculated routes in terms of a level of likely congestion or delay on account thereof.
  • Other POI-based and traffic information-based route calculation and navigation criteria are also possible.
  • Although route calculation and navigation functions are fundamental to the overall utility of PNDs, it is possible to use the device purely for information display, or "free-driving", in which only map information relevant to the current device location is displayed, and in which no route has been calculated and no navigation is currently being performed by the device. Such a mode of operation is often applicable when the user already knows the route along which it is desired to travel and does not require navigation assistance.
  • During navigation along a calculated route, it is usual for such PNDs to provide visual and/or audible instructions to guide the user along a chosen route to the end of that route, i.e. the desired destination.
  • It is also usual for PNDs to display map information on-screen during the navigation, such information regularly being updated on-screen so that the map information displayed is representative of the current location of the device, and thus of the user or user's vehicle if the device is being used for in-vehicle navigation.
  • An icon displayed on-screen typically denotes the current device location, and is centred with the map information of current and surrounding roads in the vicinity of the current device location and other map features also being displayed. Additionally, navigation information may be displayed, optionally in a status bar above, below or to one side of the displayed map information, examples of navigation information include a distance to the next deviation from the current road required to be taken by the user, the nature of that deviation possibly being represented by a further icon suggestive of the particular type of deviation, for example a left or right turn.
  • the navigation function also determines the content, duration and timing of audible instructions by means of which the user can be guided along the route. As can be appreciated, a simple instruction such as "turn left in 100 m" requires significant processing and analysis.
  • user interaction with the device may be by a touch screen, or additionally or alternately by steering column mounted remote control, by voice activation or by any other suitable method.
  • Devices of the type described above for example the GO 940 LIVE model manufactured and supplied by TomTom International B.V., provide a reliable means for enabling users to navigate from one position to another. Such devices are of great utility when the user is not familiar with the route to the destination to which they are navigating.
  • In PNDs that rely upon 3D map information generated using texturing and aerial imagery modelling, image data is stored in relation to buildings identified in a map database stored and used by the PND, the image data either being stored directly in the map database or separately but in a relational manner with respect to the buildings identified in the map database.
  • One known technique for generating image data and storing the image data in a so-called geocoded map database is described in International patent application no. PCT/EP2005/055317 (published under International publication no. WO 2007/045272).
  • This document describes an example of the texturing and aerial imagery modelling technique in which the height of a building is calculated from image sequence data, which with other data is used to generate a façade image from the image sequence data, including conversion of a view of the building from an inclined viewpoint to a non-inclined viewpoint.
  • the façade image is then stored in an enhanced geocoded map database or the geocoded map database is provided with references to façade images stored in a separate database of façades.
  • Other implementations are also described in this document, but the implementations rely upon façade images being stored, either in parts or as a whole for each building identified in the geo-coded map database.
  • the storage capacity required for such façade images is large and the façade images are distorted due to the conversion that takes place to modify the viewpoint of the façade image.
  • the façade images generated are incorrect for the viewpoint to be presented, for example at a zero elevation angle the user is presented with a translated view of an aerial view of a balcony even though it is the side or underside of the balcony that should be visible to the user.
  • a method of generating façade data for a geospatial database comprising: collecting image data relating to the façade of a building; categorising the façade of the building from the collected image data by selecting a façade category to be associated with the building, the façade category being selected from a plurality of predetermined façade categories; providing a library of façade images comprising a plurality of façade images, the façade category having at least one façade image associated therewith; and recording an association between the building and the façade category for access of the façade image (an illustrative data-model sketch follows these definitions).
  • the method can be a computer implemented method of generating the façade data.
  • the method may further comprise: determining a first attribute of the façade of the building; and recording the first attribute in respect of the building.
  • the method may further comprise: determining a second attribute of the façade of the building; and recording the second attribute in respect of the building.
  • the selection of the façade category may be in response to obtaining a substantial image match between the façade of the building and a façade image associated with the selected façade category.
  • the first attribute may be a height associated with the façade of the building or the first attribute may be a colour associated with the façade of the building.
  • the second attribute may be a colour associated with the façade of the building or the second attribute may be a height associated with the façade of the building.
  • the façade category may comprise at least one sub-category.
  • the façade image may be a façade component image; the façade component image may be a repeatably reproducible image for constructing the façade image associated with the building.
  • the method may further comprise using an electronic storage device, such as a memory (volatile or non-volatile) to store the association between the building and the façade category.
  • the plurality of façade images of the library of façade images may comprise one or more textures.
  • each façade image may be stored as a separate texture.
  • a plurality of façade images may also be stored together in a single texture.
  • a building façade information generation system comprising: an input arranged to receive image data relating to the façade of a building; a processing resource arranged to support an analyser module, the analyser module being arranged to categorise the façade of the building by selecting a façade category to be associated with the building, the façade category being selected from a plurality of predetermined façade categories; a library of façade images comprising a plurality of façade images, the façade category having at least one façade image associated therewith; and a geospatial database identifying the building; wherein the analyser module is arranged to update the geospatial database with an association between the building and façade category for access of the façade image.
  • the system may further comprise: determining a first attribute of the façade of the building; and recording the first attribute in respect of the building.
  • the system may further comprise: determining a second attribute of the façade of the building; and recording the second attribute in respect of the building.
  • the analyser module may be arranged to select the façade category in response to determining a substantial image match between the façade of the building and a façade image associated with the selected façade category.
  • the first attribute may be a height associated with the façade of the building or the first attribute may be a colour associated with the façade of the building.
  • the second attribute may be a colour associated with the façade of the building or the second attribute may be a height associated with the façade of the building.
  • the façade category may comprise at least one sub-category.
  • the façade image may be a façade component image; the façade component image may be a repeatably reproducible image for constructing the façade image associated with the building.
  • the system may further comprise: an electronic storage device arranged to store the association between the building and the façade category.
  • the plurality of façade images of the library of façade images may comprise a texture.
  • a mobile computing apparatus comprising: a processing resource arranged to support, when in use, an operational environment, the operational environment supporting a location determination module and an image generation module; and a map database comprising geospatial data associated with a building and building façade category data associated with the building; wherein the location determination module is arranged to determine a current location; the image generation module is arranged to obtain the current location from the location determination module and to determine that the building needs to be displayed in three dimensions; the image generation module accesses the map database and uses the building façade category data stored in relation to the building in order to access a façade image from a library of building façade images, the library of building façade images comprising at least one façade image associated with the building façade category; and the image generation module is arranged to generate an image of the building as a three dimensional view using the façade image.
  • the image generation module may be arranged to obtain a first attribute from the map database, the first attribute being associated with the façade of the building; and the image generation module may be arranged to adapt the façade image in accordance with the first attribute.
  • the first attribute may be a height associated with the façade of the building or the first attribute may be a colour associated with the façade of the building.
  • the image generation module may be arranged to obtain a second attribute from the map database; the second attribute may be associated with the façade of the building; and the image generation module may be arranged to adapt the façade image in accordance with the second attribute.
  • the second attribute may be a height associated with the façade of the building or the second attribute may be a colour associated with the façade of the building.
  • the plurality of façade images of the library of façade images may comprise a texture; and the image generation module may be arranged to map the texture onto a building profile at least as part of the generation of the image of the building.
  • a navigation apparatus comprising the mobile computing apparatus as set forth above in relation to the third aspect of the invention.
  • a method of rendering a façade of a building for display by a mobile computing apparatus comprising: the mobile computing apparatus determining a current location; an image generation module obtaining the current location and determining that a building needs to be displayed in three dimensions; accessing a map database, the map database comprising geospatial data associated with the building and building façade category data associated with the building; the image generation module using the building façade category data stored in relation to the building in order to access a façade image from a library of building façade images, the library of building façade images comprising at least one façade image associated with the building façade category; and the image generation module generating an image of the building as a three dimensional view using the façade image.
  • the image generation module may obtain a first attribute from the map database, the attribute being associated with the façade of the building; and the image generation module may adapt the façade image in accordance with the first attribute.
  • the first attribute may be a height associated with the façade of the building or the first attribute may be a colour associated with the façade of the building.
  • the image generation module may obtain a second attribute from the map database; the attribute may be associated with the façade of the building; and the image generation module may adapt the façade image in accordance with the second attribute.
  • the second attribute may be a height associated with the façade of the building or the second attribute may be a colour associated with the façade of the building.
  • the plurality of façade images of the library of façade images may comprise a texture; and the generation of the image of the building may comprise mapping the texture onto a building profile.
  • a computer program element comprising computer program code means to make a computer execute the method as set forth above in relation to the first aspect or the fifth aspect of the invention.
  • the computer program element may be embodied on a computer readable medium.
  • Figure 1 is a schematic diagram of a computing arrangement implementing an embodiment of the invention
  • Figure 2 is a schematic diagram of a geographic database management facility supported by the computing arrangement of Figure 1;
  • Figure 3 is a flow diagram of a method of generating façade data for a geospatial database employed by the computing arrangement of Figure 1;
  • FIG. 4 is a schematic illustration of an exemplary part of a Global Positioning System (GPS) usable by a navigation apparatus;
  • Figure 5 is a schematic diagram of electronic components of a navigation apparatus constituting another embodiment of the invention.
  • Figure 6 is a schematic representation of an architectural stack employed by the navigation apparatus of Figure 5;
  • Figure 7 is a schematic diagram of an image processing module of Figure 6 in greater detail.
  • Figure 8 is a flow diagram of a method of rendering a façade of a building constituting yet another embodiment of the invention.
  • Example embodiments of the present disclosure may be described with particular reference to a navigation device (ND) or personal navigation device (PND). It should be remembered, however, that the teachings of the present disclosure are not limited to NDs or PNDs, but are instead universally applicable to any type of processing device that is configured to execute navigation software so as to provide route planning and navigation functionality. It follows, therefore, that in the context of the present application, a navigation device is intended to include (without limitation) any type of route planning and navigation device, irrespective of whether that device is embodied as a PND, a navigation device built into a vehicle, or a computing resource (such as a desktop or portable personal computer (PC), mobile telephone or portable digital assistant (PDA)) executing route planning and navigation software.
  • location measurements may be obtained from any source and are not limited to GPS.
  • Wi-Fi access points or cellular communications networks can be used.
  • a computing arrangement 100 comprising a processing resource 102, for example a processor, such as a microprocessor.
  • the processor 102 is coupled to a plurality of storage components, including a hard disk drive 104, a Read Only Memory (ROM) 106, a digital memory, for example a flash memory 108, and a Random Access Memory (RAM) 110. Not all of the memory types described above need necessarily be provided. Moreover, these storage components need not be located physically close to the processor 102, and can be located remotely from the processor 102.
  • the processor 102 is also coupled to one or more input devices for inputting instructions and data by a user, for example a keyboard 112 and a mouse 114.
  • input devices for example a touch screen input unit, a trackball and/or a voice recognition unit, or any other input device, known to persons skilled in the art, can also be provided.
  • a removable media unit 116 coupled to the processor 102 is provided.
  • the removable media unit 116 is arranged to read data from and possibly write data to a removable data carrier or removable storage medium, for example a Compact Disc-ReWritable (CD-RW) disc 118.
  • the removable data carriers can be, for example: tapes, DVDs, CD-Rs, DVD-Rs, CD-ROMs, DVD-RAMs, SD Cards or memory sticks as is known to persons skilled in the art.
  • the processor 102 can be coupled to a printer 120 for printing output data on paper, as well as being coupled to a display 122, for instance, a monitor, such as an LCD (Liquid Crystal Display) monitor, or any other type of display known to persons skilled in the art.
  • the processor 102 can also be coupled to a loudspeaker 124.
  • the processor 102 can be coupled to a communications network 126, such as a data network, by means of a data communications interface 128.
  • the processor 102 can therefore be arranged to communicate with other communication-enabled equipment through the network 126.
  • the data carrier 118 can comprise a computer program product in the form of data and/or instructions arranged to provide the processor 102 with the capacity to perform a method as described later herein. However, such a computer program product may, alternatively, be downloaded via the communications network 126.
  • the processing resource 102 can be implemented as a standalone system, or as a plurality of parallel operating processors each arranged to carry out sub-tasks of a larger computer program, or as one or more main processors with several sub-processors.
  • the components contained in the computing arrangement 100 of Figure 1 are those typically found in general purpose computer systems, and are intended to represent a broad category of such computer components that are well known in the art.
  • the computing arrangement 100 of Figure 1 can be a Personal Computer (PC), workstation, minicomputer, mainframe computer, etc.
  • the computer can also include different bus configurations, networked platforms, multi-processor platforms, etc.
  • Various operating systems can be used including UNIX, Solaris, Linux, Windows, Macintosh OS, or any other suitable operating system.
  • a geographic database management facility 200 includes a collection facility 202.
  • the collection facility 202 collects geocoded image sequences 106 or any other suitable data from which a façade of a building, or other geopositional feature, can be determined.
  • Original map data is used in combination with geocoded image sequences in order to generate a geospatial database 206.
  • the original map data 104 is a collection of one or more files making up a map database.
  • the original map data 104 includes geocoded digital 2D city maps, including building footprint information and corresponding geoposition information.
  • the geoposition information in geocoded digital 2D city maps corresponds to the location coordinates of objects, such as XY coordinates or the like.
  • the geocoded image sequences obtained by the collection facility 202 are, in this example, image sequences obtained with a mobile mapping vehicle or the like.
  • the mobile mapping vehicle for example a delivery van or multi-purpose vehicle, has a cluster of image sensors mounted externally.
  • the image sensors can be in the form of cameras such as CCD cameras.
  • At least one pair of the image sensors can be a stereoscopic pair.
  • Information concerning the precise position and orientation of the vehicle is obtained from GPS data and an inertial system.
  • the image sensors provide a number of overlapping images of all features of interest in the vicinity of the vehicle. These images are stored for later processing as will be described later herein. Furthermore, the position of the image sensors with respect to each other, and the orientation of the image sensors with respect to the vehicle, are accurately determined. This information is digitally stored as camera calibration information in a file.
  • the GPS determines accurately the geoposition of the vehicle. In combination with the camera calibration information, the geoposition of the image sensors is determined.
  • a processor for example a personal computer, combines geopositions with the image sequences, to enable the determination of the exact geopositions of each of the images.
  • the original map data and the geocoded image sequences are stored on a processor readable storage medium, for example the data carrier 118.
  • the collection facility 202 receives the original map data and the geocoded image sequences and determines attribute data associated with buildings.
  • the analysis module 204 can determine the height of the building from the extracted images using, for example, triangulation (an illustrative stereo triangulation sketch follows these definitions). This is possible, because the distance between the locations of the cameras is known and stored as part of the calibration of the mobile mapping vehicle.
  • the height information is an example of an attribute of the building. Another attribute can be the width of the building or the colour of a façade of the building.
  • the façade of a building is selected. This can be achieved in an automated manner, for example using image processing software, or manually by a Digital Map Technician (DMT).
  • the image can be selected by means of the geocoded information and the camera calibration information of the image sequences in combination with the geopositions of the boundary for which a detailed façade has to be generated.
  • a library of façade images 208 is also provided.
  • the library of façade images comprises a plurality of images of façades of buildings according to category of building, for example a Georgian façade, which would have "Georgian" as its category.
  • the category can, in another embodiment, comprise a sub-category, for example to identify a particular style of Georgian façade.
  • the façade images are graphically drawn images, for example "cartoons" of façades, although the skilled person should appreciate that the façades can be images obtained, for example, using the mobile mapping vehicle mentioned above.
  • the analysis module 204 analyses (Step 302) the selected image in order to identify the façade of the building. If a façade is not identified (Step 304) from the image of the building extracted, the analysis module repeats the above steps of extracting images and trying to identify a façade from the extracted image. However, if a façade has been identified, the analysis module 204 then attempts to categorise (Step 306) the façade identified by referencing the library 208 in an automated manner using an image matching technique (an illustrative matching sketch follows these definitions), or with the assistance of the DMT when manual comparison is required.
  • the original map data is augmented or supplemented (Step 308) with the categorisation data and stored (Step 310) in the geospatial database 206.
  • the analysis module 204 determines (Step 312) whether further image sequences need to be analysed in order to categorise fagades of buildings. In the event that further buildings exist that can be categorised, the above process is repeated. Otherwise, the process terminates.
  • the augmented geospatial database 206 can be used by various devices and/or systems to provide three dimensional (3D) views of buildings. In this respect, the generation of a three dimensional view of a building will now be described in the context of the navigation apparatus 200 being used to assist a user to navigate from a first location to a second location.
  • the GPS is a satellite-radio based navigation system capable of determining continuous position, velocity, time, and in some instances direction information for an unlimited number of users.
  • Known as NAVSTAR, the GPS incorporates a plurality of satellites which orbit the earth in extremely precise orbits. Based on these precise orbits, GPS satellites can relay their location to any number of receiving units.
  • the GPS system is implemented when a device, specially equipped to receive GPS data, begins scanning radio frequencies for GPS satellite signals. Upon receiving a radio signal from a GPS satellite, the device determines the precise location of that satellite via one of a plurality of different conventional methods. The device will continue scanning, in most instances, for signals until it has acquired at least three different satellite signals (noting that position is not normally, but can be, determined with only two signals using other triangulation techniques). Implementing geometric triangulation, the receiver uses the three known positions to determine its own two-dimensional position relative to the satellites. This can be done in a known manner. Additionally, acquiring a fourth satellite signal allows the receiving device to calculate its three dimensional position by the same geometrical calculation in a known manner. The position and velocity data can be updated in real time on a continuous basis by an unlimited number of users.
  • the GPS system 500 comprises a plurality of satellites 502 orbiting the earth 504.
  • a GPS receiver 506 receives spread spectrum GPS satellite data signals 508 from a number of the plurality of satellites 502.
  • the spread spectrum data signals 508 are continuously transmitted from each satellite 502, the spread spectrum data signals 508 transmitted each comprise a data stream including information identifying a particular satellite 502 from which the data stream originates.
  • the GPS receiver 506 generally requires spread spectrum data signals 508 from at least three satellites 502 in order to be able to calculate a two-dimensional position. Receipt of a fourth spread spectrum data signal enables the GPS receiver 506 to calculate, using a known technique, a three-dimensional position (an illustrative position-fix sketch follows these definitions).
  • the block diagram of the navigation apparatus 200 is not inclusive of all components of the navigation apparatus, but is only representative of many example components.
  • the navigation apparatus 200 is located within a housing (not shown).
  • the navigation apparatus 200 includes a processing resource, for example a processor 602, the processor 602 being coupled to an input device 604 and a display device, for example a display screen 606.
  • the input device 604 represents any number of input devices, including a keyboard device, voice input device, touch panel and/or any other known input device utilised to input information.
  • the display screen 606 can include any type of display screen such as a Liquid Crystal Display (LCD), for example.
  • one aspect of the input device 604, the touch panel, and the display screen 606 are integrated so as to provide an integrated input and display device, including a touchpad or touchscreen input to enable both input of information (via direct input, menu selection, etc.) and display of information through the touch panel screen so that a user need only touch a portion of the display screen 606 to select one of a plurality of display choices or to activate one of a plurality of virtual or "soft" buttons.
  • the processor 602 supports a Graphical User Interface (GUI) that operates in conjunction with the touchscreen.
  • the processor 602 is operatively connected to and capable of receiving input information from input device 604 via a connection 610, and operatively connected to at least one of the display screen 606 and an output device 608, via respective output connections 612, to output information thereto.
  • the output device 608 is, for example, an audible output device (e.g. including a loudspeaker).
  • input device 604 can include a microphone and software for receiving input voice commands as well.
  • the navigation apparatus 200 can also include any additional input device 604 and/or any additional output device, such as audio input/output devices.
  • the processor 602 is operably coupled to a memory resource 614 via connection 616 and is further adapted to receive/send information from/to input/output (I/O) ports 618 via connection 620, wherein the I/O port 618 is connectible to an I/O device 622 external to the navigation apparatus 200.
  • the external I/O device 622 may include, but is not limited to an external listening device, such as an earpiece for example.
  • the connection to I/O device 622 can further be a wired or wireless connection to any other external device such as a car stereo unit for hands-free operation and/or for voice activated operation for example, for connection to an earpiece or headphones.
  • the memory resource 614 comprises, for example, a volatile memory, such as a Random Access Memory (RAM) and a non-volatile memory, for example a digital memory, such as a flash memory.
  • Figure 5 further illustrates an operative connection between the processor 602 and an antenna/receiver 624 via connection 626, wherein the antenna/receiver 624 can be a GPS antenna/receiver for example.
  • the antenna and receiver designated by reference numeral 624 are combined schematically for illustration, but the antenna and receiver may be separately located components, and the antenna can be a GPS patch antenna or helical antenna, for example.
  • the processor 602 is also coupled to a cellular communications module 628 constituting the mobile telephone technology.
  • the cellular communications module 628 supports a communications interface 629 for transmitting and receiving data wirelessly.
  • the cellular communications module 628 comprises a Subscriber Identity Module (SIM) (not shown) coupled thereto having a data subscription associated therewith.
  • the subscription is, in this example, for a limited data usage over a predetermined period of time, for example a calendar month. In other embodiments, the subscription need not have a data usage limit.
  • the cellular communications module 628 supports a bidirectional data communications service, for example a packet switched data service, such as a General Packet Radio Service (GPRS) supported by the GSM communications network and/or a High Speed Downlink Packet Access (HSDPA) service supported by the UMTS network.
  • the communications interface 629 is therefore compatible with the bidirectional data communications service.
  • the bidirectional data communications service supports an Internet Protocol (IP) for data communications although use of other protocols, additionally or alternatively, is contemplated.
  • IP Internet Protocol
  • the navigation apparatus 200 comprises the cellular communications module 628.
  • a data session can be established, if required, with the communications network via a separate wireless communications terminal (not shown), such as a mobile telephone, PDA, and/or any device with mobile telephone technology, in order to establish a digital connection, for example a digital connection via known Bluetooth technology.
  • the navigation apparatus 200 can be Bluetooth enabled in order that the navigation apparatus 200 can be agnostic to the settings of the wireless communications terminal, thereby enabling the navigation apparatus 200 to operate correctly with the ever changing range of mobile telephone models, manufacturers, etc. Model/manufacturer specific settings can, for example, be stored by the navigation apparatus 200, if desired. The data stored for this information can be updated.
  • the electronic components shown in Figure 5 are powered by one or more power sources (not shown) in a conventional manner.
  • the components shown in Figure 5 can be in communication with one another via wired and/or wireless connections and the like.
  • the navigation apparatus 200 described herein can be a portable or handheld navigation apparatus.
  • the memory resource 614 of the navigation apparatus 200 stores a boot loader program (not shown) that is executed by the processor 602 in order to load an operating system 638 from the memory resource 614 for execution by functional hardware components 636, which provides an environment in which application software 640 can run.
  • the operating system 638 serves to control the functional hardware components 636 and resides between the application software 640 and the functional hardware components 636.
  • the application software 640 provides an operational environment including the GUI that supports core functions of the navigation apparatus 200, for example map viewing, route planning, navigation functions and any other functions associated therewith. In this example, in order to render images of buildings in 3D, the application software 640 supports an image generation module 642.
  • the image generation module 642 comprises an image generation engine 650.
  • the image generation engine 650 is capable of communicating with a location determination module 652 that is also supported by the application software.
  • the location determination module 652 uses data obtained from the GPS receiver in order to determine the location of the navigation apparatus 200, which is an example of a mobile computing apparatus.
  • the image generation engine 650 is also operably coupled to the display device 606.
  • the image generation engine 650 is also capable of accessing a map database 654, the map database having been derived from the geospatial database 206 mentioned above.
  • the map database 654 identifies buildings and also comprises category data associated with the respective façades of the buildings and respective attribute data associated with the façades of the buildings.
  • Façade images associated with the categories of façades are provided in a local library of façade images 656.
  • the library of façade images 656 can be stored remotely and retrieved via a communications network.
  • the image generation engine 650 obtains (Step 800) a current location of the navigation apparatus 200 from the location determination module 652.
  • the image generation engine 650 accesses (Step 802) the map database 654 and obtains geospatial data required to generate images of geospatial features in the vicinity of the current location.
  • the category of the façade and any attributes of the façade are obtained from the map database 654.
  • the image generation engine then accesses (Step 804) the library of façade images 656 and retrieves a graphical representation of the façade associated with the category retrieved from the map database 654 in respect of the building.
  • the image generation engine uses the attributes and the image retrieved from the library 656 in order to generate (Step 806) a 3D representation of the building.
  • 3D representations of buildings can be made with varying levels of detail.
  • a first, aesthetically basic, level of detail is a so-called generic block model or building "profile".
  • the "profile" can be generated using any technique known in the art, e.g. as described in WO 2007/045272.
  • the profile can be generated from the 2D footprint of a building (2D footprints are commonly stored with digital maps).
  • texture can be assigned to the blocks.
  • the library 656 can comprise textures constituting façade images (and hence can have respective categories associated therewith) and a texture mapping technique can be employed to map the texture onto the building profile (an illustrative extrusion and texture-selection sketch follows these definitions).
  • the library of façade images comprises images of whole façades.
  • the façade image stored can be a façade component image that is a repeatably reproducible image for constructing the façade image associated with the building.
  • the navigation apparatus 200 is being used to generate a representation of one or more buildings without providing navigation assistance, for example when a user engages in the so-called "free driving" mentioned above.
  • the representation of building façades as described above can be employed in the context of the navigation apparatus 200 providing navigation assistance between a first specified location and a second specified location.
  • the embodiments described herein are not exclusively applicable to navigation apparatus and can be employed in relation to other portable computing apparatus for which it is desirable to display buildings in three dimensions. Indeed, implementations are not limited to portable computing apparatus and the embodiments described herein can be applied to client-server topologies.
  • the navigation apparatus may utilise any kind of position sensing technology as an alternative to (or indeed in addition to) the GPS.
  • the navigation apparatus may utilise other global navigation satellite systems (GNSS) such as the proposed European Galileo system when available.
  • Alternative embodiments of the invention can be implemented as a computer program product for use with a computer system, the computer program product being, for example, a series of computer instructions stored on a tangible data recording medium, such as a diskette, CD-ROM, ROM, or fixed disk, or embodied in a computer data signal, the signal being transmitted over a tangible medium or a wireless medium, for example, microwave or infrared.
  • the series of computer instructions can constitute all or part of the functionality described above, and can also be stored in any memory device, volatile or non-volatile, such as semiconductor, magnetic, optical or other memory device.
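The method of generating façade data summarised above (collect image data, select a façade category from a set of predetermined categories, and record the association together with attributes such as height and colour) can be illustrated with a minimal data-model sketch. All class, field and file names below (FacadeRecord, FACADE_LIBRARY and so on) are illustrative assumptions, not identifiers from the patent itself.

    from dataclasses import dataclass, field
    from typing import Dict, Optional, Tuple

    # Predetermined façade categories; the shared library maps each category
    # to one or more reusable façade images (textures).
    FACADE_LIBRARY: Dict[str, Tuple[str, ...]] = {
        "georgian": ("georgian_full.png", "georgian_window_unit.png"),
        "modern_office": ("modern_office.png",),
        "terraced_brick": ("terraced_brick_unit.png",),
    }

    @dataclass
    class FacadeRecord:
        """Per-building record: a category reference instead of a full image."""
        building_id: str
        category: str                                       # predetermined category
        height_m: Optional[float] = None                     # first attribute
        colour_rgb: Optional[Tuple[int, int, int]] = None    # second attribute

    @dataclass
    class GeospatialDatabase:
        records: Dict[str, FacadeRecord] = field(default_factory=dict)

        def record_association(self, building_id: str, category: str,
                               height_m: Optional[float] = None,
                               colour_rgb: Optional[Tuple[int, int, int]] = None) -> None:
            # Record the association between the building and the façade category.
            if category not in FACADE_LIBRARY:
                raise ValueError(f"unknown façade category: {category}")
            self.records[building_id] = FacadeRecord(building_id, category,
                                                     height_m, colour_rgb)

        def facade_images_for(self, building_id: str) -> Tuple[str, ...]:
            # The stored category is used to access images in the shared library.
            return FACADE_LIBRARY[self.records[building_id].category]

Because only a category identifier and a few scalar attributes are stored per building, the per-building storage cost is small and independent of image resolution.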
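Categorising a façade "using an image matching technique" can be sketched as scoring the extracted façade against each reference image in the library and keeping the best-scoring category. The normalised-correlation matcher below is a minimal illustration under assumed inputs (grayscale numpy arrays), not the matcher the patent specifies; a production system would use a far more robust comparison.

    import numpy as np

    def categorise_facade(facade_gray: np.ndarray, library: dict) -> str:
        """Return the library category whose reference image best matches the
        extracted façade. 'library' maps category name -> 2-D grayscale array."""
        def normalise(img):
            img = img.astype(np.float64)
            img = img - img.mean()
            norm = np.linalg.norm(img)
            return img / norm if norm else img

        def resample(img, shape):
            # Crude nearest-neighbour resample so images are directly comparable.
            rows = np.linspace(0, img.shape[0] - 1, shape[0]).astype(int)
            cols = np.linspace(0, img.shape[1] - 1, shape[1]).astype(int)
            return img[np.ix_(rows, cols)]

        target_shape = (64, 64)
        probe = normalise(resample(facade_gray, target_shape))
        scores = {}
        for category, reference in library.items():
            ref = normalise(resample(reference, target_shape))
            scores[category] = float((probe * ref).sum())  # cosine similarity
        return max(scores, key=scores.get)

A "substantial image match" could then be declared only when the winning score also exceeds a chosen threshold, with borderline cases referred to the DMT for manual comparison.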
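The building-height attribute derived "using, for example, triangulation" from the calibrated camera cluster can be illustrated with standard stereo relations: depth Z = f·B/d for focal length f, baseline B and disparity d, after which a vertical extent of h pixels at depth Z spans (h/f)·Z metres. The function and figures below are an illustrative sketch with assumed parameter names, not the patent's own procedure.

    def building_height_from_stereo(disparity_px: float,
                                    roof_row_px: float,
                                    base_row_px: float,
                                    focal_length_px: float,
                                    baseline_m: float) -> float:
        """Estimate building height from a calibrated stereo pair on the
        mapping vehicle, using pinhole-camera stereo relations."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        depth_m = focal_length_px * baseline_m / disparity_px   # Z = f * B / d
        height_px = abs(base_row_px - roof_row_px)              # façade extent in pixels
        return (height_px / focal_length_px) * depth_m

    # Illustrative numbers: f = 1400 px, baseline 1.5 m, disparity 35 px,
    # façade spanning 420 px -> depth 60 m, height 18 m.
    print(building_height_from_stereo(35, 100, 520, 1400, 1.5))  # -> 18.0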
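The GPS position fix described above (a two-dimensional position from at least three satellite signals, a third dimension from a fourth) can be illustrated with a simplified trilateration sketch: the range equations are linearised by subtracting one from the others and solved in a least-squares sense. This is a 2-D toy example with assumed names, not receiver firmware, and it ignores clock bias.

    import numpy as np

    def position_from_ranges(anchors, ranges):
        """Solve for a 2-D receiver position from known transmitter positions
        and measured distances, via linearised least squares."""
        anchors = np.asarray(anchors, dtype=float)
        ranges = np.asarray(ranges, dtype=float)
        x0, y0 = anchors[0]
        r0 = ranges[0]
        # Subtract the first range equation to remove the quadratic terms.
        A = 2 * (anchors[1:] - anchors[0])
        b = (r0**2 - ranges[1:]**2
             + np.sum(anchors[1:]**2, axis=1) - (x0**2 + y0**2))
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pos

    # Illustrative check: a receiver at (3, 4) is recovered from three ranges.
    anchors = [(0, 0), (10, 0), (0, 10)]
    true_pos = np.array([3.0, 4.0])
    ranges = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
    print(position_from_ranges(anchors, ranges))  # approximately [3. 4.]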
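Rendering on the mobile apparatus, as described above, amounts to extruding the 2D footprint into a block-model "profile" using the height attribute and mapping the category's texture onto the walls, optionally tinted by the colour attribute. The sketch below builds the wall geometry only; actual texture mapping would be done by the device's rendering engine, and all names are illustrative assumptions.

    def extrude_footprint(footprint_xy, height_m):
        """Turn a 2-D footprint (list of (x, y) vertices) into wall quads for a
        simple block-model profile; each wall is four (x, y, z) vertices."""
        walls = []
        n = len(footprint_xy)
        for i in range(n):
            (x0, y0), (x1, y1) = footprint_xy[i], footprint_xy[(i + 1) % n]
            walls.append([(x0, y0, 0.0), (x1, y1, 0.0),
                          (x1, y1, height_m), (x0, y0, height_m)])
        return walls

    def texture_for_building(category, colour_rgb, library):
        """Resolve the texture to map onto the walls from the stored category;
        the colour attribute can be applied as a tint at render time."""
        return library[category][0], colour_rgb

    # Example: a rectangular footprint extruded to the stored height attribute.
    walls = extrude_footprint([(0, 0), (10, 0), (10, 6), (0, 6)], height_m=18.0)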

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Remote Sensing (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Navigation (AREA)

Abstract

A method of generating façade data for a geospatial database comprises collecting image data relating to the façade of a building. The façade of the building is then categorised from the collected image data in order to generate a façade category associated with the building. A library of façade images is provided, the library comprising a façade image for the building, and an association between the building and the façade category recorded for access of the façade image.

Description

METHOD OF GENERATING BUILDING FACADE DATA FOR A GEOSPATIAL DATABASE FOR A MOBILE DEVICE
Field of the Invention
The present invention relates to a method of generating fagade data for a geospatial database, the method being of the type that, for example, accesses image data and associates a fagade image therewith. The present invention also relates to a building fagade information generation system of the type that, for example, accesses image data and associates a fagade image therewith. The present invention further relates to a mobile computing apparatus of the type that, for example, provides a three dimensional representation of a building. The present invention also relates to a method of rendering a fagade of a building, the method being of the type that, for example, represents the building in three dimensions.
Background to the Invention
Portable computing devices, for example Portable Navigation Devices (PNDs) that include GPS (Global Positioning System) signal reception and processing functionality are well known and are widely employed as in-car or other vehicle navigation systems.
In general terms, a modern PND comprises a processor, memory and map data stored within said memory. The processor and memory cooperate to provide an execution environment in which a software operating system is typically established, and additionally it is commonplace for one or more additional software programs to be provided to enable the functionality of the PND to be controlled, and to provide various other functions.
Typically, these devices further comprise one or more input interfaces that allow a user to interact with and control the device, and one or more output interfaces by means of which information may be relayed to the user. Illustrative examples of output interfaces include a visual display and a speaker for audible output. Illustrative examples of input interfaces include one or more physical buttons to control on/off operation or other features of the device (which buttons need not necessarily be on the device itself but can be on a steering wheel if the device is built into a vehicle), and a microphone for detecting user speech. In one particular arrangement, the output interface display may be configured as a touch sensitive display (by means of a touch sensitive overlay or otherwise) additionally to provide an input interface by means of which a user can operate the device by touch.
Devices of this type will also often include one or more physical connector interfaces by means of which power and optionally data signals can be transmitted to and received from the device, and optionally one or more wireless transmitters/receivers to allow communication over cellular telecommunications and other signal and data networks, for example Bluetooth, Wi-Fi, Wi-Max, GSM, UMTS and the like.
PNDs of this type also include a GPS antenna by means of which satellite-broadcast signals, including location data, can be received and subsequently processed to determine a current location of the device.
The PND may also include electronic gyroscopes and accelerometers which produce signals that can be processed to determine the current angular and linear acceleration, and in turn, and in conjunction with location information derived from the GPS signal, velocity and relative displacement of the device and thus the vehicle in which it is mounted. Typically, such features are most commonly provided in in-vehicle navigation systems, but may also be provided in PNDs if it is expedient to do so.
The utility of such PNDs is manifested primarily in their ability to determine a route between a first location (typically a start or current location) and a second location (typically a destination). These locations can be input by a user of the device, by any of a wide variety of different methods, for example by postcode, street name and house number, previously stored "well known" destinations (such as famous locations, municipal locations (such as sports grounds or swimming baths) or other points of interest), and favourite or recently visited destinations.
Typically, the PND is enabled by software for computing a "best" or "optimum" route between the start and destination address locations from the map data. A "best" or "optimum" route is determined on the basis of predetermined criteria and need not necessarily be the fastest or shortest route. The selection of the route along which to guide the driver can be very sophisticated, and the selected route may take into account existing, predicted and dynamically and/or wirelessly received traffic and road information, historical information about road speeds, and the driver's own preferences for the factors determining road choice (for example the driver may specify that the route should not include motorways or toll roads).
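The selection of a "best" or "optimum" route under predetermined criteria can be thought of as minimising a weighted cost over candidate routes. The weights, field names and penalty values below are illustrative assumptions only, chosen to show the idea, and are not taken from the patent or from any real routing engine.

    def route_cost(route, prefs):
        """Score one candidate route; 'route' holds per-route measures and
        'prefs' holds the user's weights and switches. Purely illustrative."""
        cost = route["travel_time_min"] * prefs.get("time_weight", 1.0)
        cost += route["expected_delay_min"] * prefs.get("congestion_weight", 2.0)
        if prefs.get("avoid_motorways") and route["motorway_km"] > 0:
            cost += 1e6                      # effectively excludes such routes
        if prefs.get("avoid_tolls") and route["toll_km"] > 0:
            cost += 1e6
        cost -= route["scenic_poi_count"] * prefs.get("scenic_bonus", 0.0)
        return cost

    def best_route(candidates, prefs):
        # The lowest-cost candidate is offered as the "best" route.
        return min(candidates, key=lambda r: route_cost(r, prefs))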
The device may continually monitor road and traffic conditions, and offer to or choose to change the route over which the remainder of the journey is to be made due to changed conditions. Real time traffic monitoring systems, based on various technologies (e.g. mobile phone data exchanges, fixed cameras, GPS fleet tracking), are being used to identify traffic delays and to feed the information into notification systems. PNDs of this type may typically be mounted on the dashboard or windscreen of a vehicle, but may also be formed as part of an on-board computer of the vehicle radio or indeed as part of the control system of the vehicle itself. The navigation device may also be part of a hand-held system, such as a PDA (Portable Digital Assistant), a media player, a mobile telephone or the like, and in these cases, the normal functionality of the hand-held system is extended by means of the installation of software on the device to perform both route calculation and navigation along a calculated route.
Once a route has been calculated by a PND, the user interacts with the navigation device to select the desired calculated route, optionally from a list of proposed routes. Optionally, the user may intervene in, or guide the route selection process, for example by specifying that certain routes, roads, locations or criteria are to be avoided or are mandatory for a particular journey. The route calculation aspect of the PND forms one primary function, and navigation along such a route is another primary function.
A further important function provided by the device is automatic route re-calculation in the event that: a user deviates from the previously calculated route during navigation (either by accident or intentionally); real-time traffic conditions dictate that an alternative route would be more expedient and the device is suitably enabled to recognize such conditions automatically, or if a user actively causes the device to perform route re-calculation for any reason.
As mentioned above, it is also known to allow a route to be calculated with user defined criteria; for example, the user may prefer a scenic route to be calculated by the device, or may wish to avoid any roads on which traffic congestion is likely, expected or currently prevailing. The device software would then calculate various routes and weigh more favourably those that include along their route the highest number of points of interest (known as POIs), which are examples of geographic features, tagged as being for example of scenic beauty, or, using stored information indicative of prevailing traffic conditions on particular roads, order the calculated routes in terms of a level of likely congestion or delay on account thereof. Other POI-based and traffic information-based route calculation and navigation criteria are also possible.
Although the route calculation and navigation functions are fundamental to the overall utility of PNDs, it is possible to use the device purely for information display, or "free-driving", in which only map information relevant to the current device location is displayed, and in which no route has been calculated and no navigation is currently being performed by the device. Such a mode of operation is often applicable when the user already knows the route along which it is desired to travel and does not require navigation assistance. During navigation along a calculated route, it is usual for such PNDs to provide visual and/or audible instructions to guide the user along a chosen route to the end of that route, i.e. the desired destination. It is also usual for PNDs to display map information on-screen during the navigation, such information regularly being updated on-screen so that the map information displayed is representative of the current location of the device, and thus of the user or user's vehicle if the device is being used for in-vehicle navigation.
In this respect, there is an increasing desire to display map information in three dimensions (3D). A 3D representation of an otherwise unfamiliar environment can help a user of the PND to locate a destination or a current location more easily than map information displayed in two dimensions (2D). In a city, two kinds of 3D city modelling for navigation are known: block modelling, and modelling in which building blocks are textured with aerial imagery.
An icon displayed on-screen typically denotes the current device location and is centred within the displayed map information, with current and surrounding roads in the vicinity of the current device location and other map features also being displayed. Additionally, navigation information may be displayed, optionally in a status bar above, below or to one side of the displayed map information. Examples of navigation information include the distance to the next deviation from the current road required to be taken by the user, the nature of that deviation possibly being represented by a further icon suggestive of the particular type of deviation, for example a left or right turn. The navigation function also determines the content, duration and timing of audible instructions by means of which the user can be guided along the route. As can be appreciated, a simple instruction such as "turn left in 100 m" requires significant processing and analysis. As previously mentioned, user interaction with the device may be by a touch screen, or additionally or alternatively by steering column mounted remote control, by voice activation or by any other suitable method.
Devices of the type described above, for example the GO 940 LIVE model manufactured and supplied by TomTom International B.V., provide a reliable means for enabling users to navigate from one position to another. Such devices are of great utility when the user is not familiar with the route to the destination to which they are navigating.
However, in relation to PNDs that rely upon 3D map information generated using the texturing and aerial imagery modelling technique, image data is stored in relation to buildings identified in a map database stored and used by the PND, the image data either being stored directly in the map database or separately but in a relational manner with respect to the buildings identified in the map database. One known technique for generating image data and storing the image data in a so-called geocoded map database is described in International patent application no. PCT/EP2005/055317 (published under International publication no. WO 2007/045272). This document describes an example of the texturing and aerial imagery modelling technique in which the height of a building is calculated from image sequence data, which, together with other data, is used to generate a fagade image from the image sequence data, including conversion of a view of the building from an inclined viewpoint to a non-inclined viewpoint. The fagade image is then stored in an enhanced geocoded map database or the geocoded map database is provided with references to fagade images stored in a separate database of fagades. Other implementations are also described in this document, but the implementations rely upon fagade images being stored, either in parts or as a whole, for each building identified in the geo-coded map database.
However, the storage capacity required for such fagade images is large and the fagade images are distorted due to the conversion that takes place to modify the viewpoint of the fagade image. Furthermore, where aerial views of the buildings are used, the fagade images generated are incorrect for the viewpoint to be presented, for example at a zero elevation angle the user is presented with a translated view of an aerial view of a balcony even though it is the side or underside of the balcony that should be visible to the user.
One solution to this problem is to create a cartoon-like or schematic view of the side of each building, known as a "texture" of the building. The schematic view of the building does not contain distortions and other artefacts typical of a fagade image generated from an aerial image or an image from a non-zero viewing angle. However, this solution still requires a large storage capacity as an image is stored for each building identified in a map database.
Summary of the Invention
According to a first aspect of the present invention, there is provided a method of generating fagade data for a geospatial database, the method comprising: collecting image data relating to the fagade of a building; categorising the fagade of the building from the collected image data by selecting a fagade category to be associated with the building, the fagade category being selected from a plurality of predetermined fagade categories; providing a library of fagade images comprising a plurality of facade images, the fagade category having at least one fagade image associated therewith; and recording an association between the building and the fagade category for access of the fagade image. The method can be a computer implemented method of generating the fagade data.
The method may further comprise: determining a first attribute of the fagade of the building; and recording the first attribute in respect of the building.
The method may further comprise: determining a second attribute of the fagade of the building; and recording the second attribute in respect of the building.
The selection of the fagade category may be in response to obtaining a substantial image match between the fagade of the building and a fagade image associated with the selected fagade category.
The first attribute may be a height associated with the fagade of the building or the first attribute may be a colour associated with the fagade of the building.
The second attribute may be a colour associated with the fagade of the building or the second attribute may be a height associated with the fagade of the building.
The fagade category may comprise at least one sub-category.
The fagade image may be a fagade component image; the fagade component image may be a repeatably reproducible image for constructing the fagade image associated with the building.
The method may further comprise using an electronic storage device, such as a memory (volatile or non-volatile), to store the association between the building and the fagade category.
The plurality of fagade images of the library of fagade images may comprise one or more textures. For example, each fagade image may be stored as a separate texture. A plurality of fagade images may also be stored together in a single texture.
According to a second aspect of the present invention, there is provided a building fagade information generation system comprising: an input arranged to receive image data relating to the fagade of a building; a processing resource arranged to support an analyser module, the analyser module being arranged to categorise the fagade of the building by selecting a fagade category to be associated with the building, the fagade category being selected from a plurality of predetermined fagade categories; a library of fagade images comprising a plurality of facade images, the fagade category having at least one fagade image associated therewith; and a geospatial database identifying the building; wherein the analyser module is arranged to update the geospatial database with an association between the building and fagade category for access of the fagade image.
The system may be further arranged to determine a first attribute of the fagade of the building and to record the first attribute in respect of the building. The system may be further arranged to determine a second attribute of the fagade of the building and to record the second attribute in respect of the building.
The analyser module may be arranged to select the fagade category in response to determining a substantial image match between the fagade of the building and a fagade image associated with the selected fagade category.
The first attribute may be a height associated with the fagade of the building or the first attribute may be a colour associated with the fagade of the building.
The second attribute may be a colour associated with the fagade of the building or the second attribute may be a height associated with the fagade of the building.
The fagade category may comprise at least one sub-category.
The fagade image may be a fagade component image; the fagade component image may be a repeatably reproducible image for constructing the fagade image associated with the building.
The system may further comprise: an electronic storage device arranged to store the association between the building and the fagade category.
The plurality of fagade images of the library of fagade images may comprise a texture.
According to a third aspect of the present invention, there is provided a mobile computing apparatus comprising: a processing resource arranged to support, when in use, an operational environment, the operational environment supporting a location determination module and an image generation module; and a map database comprising geospatial data associated with a building and building fagade category data associated with the building; wherein the location determination module is arranged to determine a current location; the image generation module is arranged to obtain the current location from the location determination module and to determine that the building needs to be displayed in three dimensions; the image generation module accesses the map database and uses the building fagade category data stored in relation to the building in order to access a fagade image from a library of building fagade images, the library of building fagade images comprising at least one fagade image associated with the building fagade category; and the image generation module is arranged to generate an image of the building as a three dimensional view using the fagade image.
The image generation module may be arranged to obtain a first attribute from the map database, the first attribute being associated with the fagade of the building; and the image generation module may be arranged to adapt the fagade image in accordance with the first attribute. The first attribute may be a height associated with the fagade of the building or the first attribute may be a colour associated with the fagade of the building.
The image generation module may be arranged to obtain a second attribute from the map database; the second attribute may be associated with the fagade of the building; and the image generation module may be arranged to adapt the fagade image in accordance with the second attribute.
The second attribute may be a height associated with the fagade of the building or the second attribute may be a colour associated with the fagade of the building.
The plurality of fagade images of the library of fagade images may comprise a texture; and the image generation module may be arranged to map the texture onto a building profile at least as part of the generation of the image of the building.
According to a fourth aspect of the present invention, there is provided a navigation apparatus comprising the mobile computing apparatus as set forth above in relation to the third aspect of the invention.
According to a fifth aspect of the present invention, there is provided a method of rendering a fagade of a building for display by a mobile computing apparatus, the method comprising: the mobile computing apparatus determining a current location; an image generation module obtaining the current location and determining that a building needs to be displayed in three dimensions; accessing a map database, the map database comprising geospatial data associated with the building and building fagade category data associated with the building; the image generation module using the building fagade category data stored in relation to the building in order to access a fagade image from a library of building fagade images, the library of building fagade images comprising at least one fagade image associated with the building fagade category; and the image generation module generating an image of the building as a three dimensional view using the fagade image.
The image generation module may obtain a first attribute from the map database, the attribute being associated with the fagade of the building; and the image generation module may adapt the fagade image in accordance with the first attribute.
The first attribute may be a height associated with the fagade of the building or the first attribute may be a colour associated with the fagade of the building.
The image generation module may obtain a second attribute from the map database; the attribute may be associated with the fagade of the building; and the image generation module may adapt the fagade image in accordance with the second attribute.
The second attribute may be a height associated with the fagade of the building or the second attribute may be a colour associated with the fagade of the building. The plurality of fagade images of the library of fagade images may comprise a texture; and the generation of the image of the building may comprise mapping the texture onto a building profile.
According to a sixth aspect of the present invention, there is provided a computer program element comprising computer program code means to make a computer execute the method as set forth above in relation to the first aspect or the fifth aspect of the invention.
The computer program element may be embodied on a computer readable medium.
It should be appreciated that although the features set forth above or in the appended claims are recited in a certain order, those features can be used in any suitable combination or individually as appropriate.
It is thus possible to provide a method of generating fagade data and a building fagade information generation system that provide output data enabling images of building fagades to be generated whilst minimising image data storage requirements. This reduction in storage requirements is achieved through the use of categorisation and hence the storage of an association between a building and a fagade image as textual data instead of image data. The reduction in data storage requirements enables three dimensional views of buildings to be generated by an increased range of portable computing devices having limited storage capacity. It is also possible to provide a portable computing apparatus and a method of rendering a fagade of a building that support a smaller storage requirement than current requirements.
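By way of a purely illustrative, back-of-the-envelope comparison (every figure below is an assumed example value, not a measurement from any actual map database), the nature of the saving can be sketched as follows:

    # Illustrative storage comparison; all figures are assumptions for illustration only.
    buildings = 100_000                    # assumed number of buildings in a city map
    facade_image_bytes = 20 * 1024         # assumed size of one compressed fagade image
    categories = 50                        # assumed number of predetermined fagade categories
    category_record_bytes = 8              # assumed size of a category reference plus attributes

    per_building_images = buildings * facade_image_bytes
    shared_library = buildings * category_record_bytes + categories * facade_image_bytes

    print(per_building_images // 2**20, "MiB when a fagade image is stored per building")
    print(shared_library // 2**20, "MiB when only a category reference is stored per building")

Under these assumed figures, the per-building image approach requires on the order of two gigabytes, whereas the category-based approach requires only a few megabytes, which is what makes the technique attractive for devices with limited storage capacity.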
Other advantages of these embodiments are set out hereafter, and further details and features of each of these embodiments are defined in the accompanying dependent claims and elsewhere in the following detailed description.
Brief Description of the Drawings
At least one embodiment of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of a computing arrangement implementing an embodiment of the invention;
Figure 2 is a schematic diagram of a geographic database management system supported by the computing arrangement of Figure 1;
Figure 3 is a flow diagram of a method of generating fagade data for a geospatial database employed by the computing arrangement of Figure 1;
Figure 4 is a schematic illustration of an exemplary part of a Global Positioning System (GPS) usable by a navigation apparatus;
Figure 5 is a schematic diagram of electronic components of a navigation apparatus constituting another embodiment of the invention;
Figure 6 is a schematic representation of an architectural stack employed by the navigation apparatus of Figure 5;
Figure 7 is a schematic diagram of an image processing module of Figure 6 in greater detail; and
Figure 8 is a flow diagram of a method of rendering a fagade of a building constituting yet another embodiment of the invention.
Detailed Description of Preferred Embodiments
Throughout the following description identical reference numerals will be used to identify like parts.
Example embodiments of the present disclosure may be described with particular reference to a navigation device (ND) or personal navigation device (PND). It should be remembered, however, that the teachings of the present disclosure are not limited to NDs or PNDs, but are instead universally applicable to any type of processing device that is configured to execute navigation software so as to provide route planning and navigation functionality. It follows, therefore, that in the context of the present application, a navigation device is intended to include (without limitation) any type of route planning and navigation device, irrespective of whether that device is embodied as a PND, a navigation device built into a vehicle, or a computing resource (such as a desktop or portable personal computer (PC), mobile telephone or portable digital assistant (PDA)) executing route planning and navigation software.
Moreover, while example embodiments described herein make use of GPS measurements including latitude and longitude coordinates as location measurements, it should be understood that location measurements may be obtained from any source and are not limited to GPS. For example, Wi-Fi access points or cellular communications networks can be used.
Referring to Figure 1, an overview is given of a computing arrangement 100 comprising a processing resource 102, for example a processor, such as a microprocessor.
The processor 102 is coupled to a plurality of storage components, including a hard disk drive 104, a Read Only Memory (ROM) 106, a digital memory, for example a flash memory 108, and a Random Access Memory (RAM) 110. Not all of the memory types described above need necessarily be provided. Moreover, these storage components need not be located physically close to the processor 102, and can be located remotely from the processor 102.
The processor 102 is also coupled to one or more input devices for inputting instructions and data by a user, for example a keyboard 112 and a mouse 114. Other input devices, for example a touch screen input unit, a trackball and/or a voice recognition unit, or any other input device, known to persons skilled in the art, can also be provided.
A removable media unit 116 coupled to the processor 102 is provided. The removable media unit 116 is arranged to read data from and possibly write data to a removable data carrier or removable storage medium, for example a Compact Disc-ReWritable (CD-RW) disc 118. In other examples, the removable data carriers can be, for example: tapes, DVDs, CD-Rs, DVD-Rs, CD-ROMs, DVD-RAMs, SD Cards or memory sticks as is known to persons skilled in the art.
The processor 102 can be coupled to a printer 120 for printing output data on paper, as well as being coupled to a display 122, for instance, a monitor, such as an LCD (Liquid Crystal Display) monitor, or any other type of display known to persons skilled in the art. The processor 102 can also be coupled to a loudspeaker 124. Furthermore, the processor 102 can be coupled to a communications network 126, such as a data network, by means of a data communications interface 128. The processor 102 can therefore be arranged to communicate with other communication-enabled equipment through the network 126.
The data carrier 118 can comprise a computer program product in the form of data and/or instructions arranged to provide the processor 102 with the capacity to perform a method as described later herein. However, such a computer program product may, alternatively, be downloaded via the communications network 126.
The processing resource 102 can be implemented as a standalone system, or as a plurality of parallel operating processors each arranged to carry out sub-tasks of a larger computer program, or as one or more main processors with several sub-processors.
Furthermore, parts of the functionality described herein can even be carried out by remote processors communicating with the processor 102 through the communications network 126.
The components contained in the computing arrangement 100 of Figure 1 are those typically found in general purpose computer systems, and are intended to represent a broad category of such computer components that are well known in the art. Thus, the computing arrangement 100 of Figure 1 can be a Personal Computer (PC), workstation, minicomputer, mainframe computer, etc. The computer can also include different bus configurations, networked platforms, multi-processor platforms, etc. Various operating systems can be used including UNIX, Solaris, Linux, Windows, Macintosh OS, or any other suitable operating system.
Referring to Figure 2, a geographic database management facility 200 includes a collection facility 202. The collection facility 202 collects geocoded image sequences 106 or any other suitable data from which a fagade of a building, or other geopositional feature, can be determined.
The technique employed for identifying a fagade for a building from an image sequence is known, for example as described in International patent application no. PCT/EP2005/055317. However, for completeness, the skilled person should understand that the geocoded image sequences are processed by an image analysis module 204, in overview, in the following manner.
Original map data is used in combination with geocoded image sequences in order to generate a geospatial database 206. The original map data 104 is a collection of one or more files making up a map database. The original map data 104 includes geocoded digital 2D city maps, including building footprint information and corresponding geoposition information. The geoposition information in geocoded digital 2D city maps corresponds to the location coordinates of objects, such as XY coordinates or the like. The geocoded image sequences obtained by the collection facility 202 are, in this example, image sequences obtained with a mobile mapping vehicle or the like. The mobile mapping vehicle, for example a delivery van or multi-purpose vehicle, has a cluster of image sensors mounted externally. The image sensors can be in the form of cameras such as CCD cameras. At least one pair of the image sensors can be a stereoscopic pair. Information concerning the precise position and orientation of the vehicle is obtained from GPS data and an inertial system. The image sensors provide a number of overlapping images of all features of interest in the vicinity of the vehicle. These images are stored for later processing, as will be described herein. Furthermore, the position of the image sensors with respect to each other, and the orientation of the image sensors with respect to the vehicle, are accurately determined. This information is digitally stored as camera calibration information in a file. The GPS accurately determines the geoposition of the vehicle. In combination with the camera calibration information, the geoposition of the image sensors is determined. A processor, for example a personal computer, combines geopositions with the image sequences to enable the determination of the exact geopositions of each of the images. While the vehicle is driven along the road network, the image sequences are captured and the corresponding geocoded information is added. An example of a mobile mapping vehicle and operation thereof is described in greater detail in "Mobile Mapping by a Car-Driven Survey System (CDSS)" (Wilhelm Benning, Thomas Aussems, October 29, 2000, Geodätisches Institut der RWTH Aachen 1998).
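For illustration only, a minimal sketch of how the geoposition of an image sensor can be derived from the vehicle geoposition, the vehicle heading and the stored camera calibration offset is given below; the function and parameter names are assumptions and not part of any actual mobile mapping system:

    import math

    def camera_geoposition(vehicle_x, vehicle_y, heading_deg, cam_offset_x, cam_offset_y):
        """Rotate the calibrated camera offset (vehicle frame) by the vehicle heading
        and add it to the vehicle geoposition (map frame)."""
        heading = math.radians(heading_deg)
        dx = cam_offset_x * math.cos(heading) - cam_offset_y * math.sin(heading)
        dy = cam_offset_x * math.sin(heading) + cam_offset_y * math.cos(heading)
        return vehicle_x + dx, vehicle_y + dy

    # Example: vehicle at (1000, 2000) with a heading of 90 degrees and an assumed
    # camera offset of (1.2, -0.8) metres in the vehicle frame.
    print(camera_geoposition(1000.0, 2000.0, 90.0, 1.2, -0.8))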
The original map data and the geocoded image sequences are stored on a processor readable storage medium, for example the data carrier 1 18. The collection facility 202 receives the original map data and the geocoded image sequences and determines attribute data associated with buildings. In this respect, the analysis module 204 can determine the height of the building from the extracted images using, for example, triangulation. This is possible, because the distance between the locations of the cameras is known and stored as part of the calibration of the mobile mapping vehicle. The height information is an example of an attribute of the building. Another attribute can be the width of the building or the colour of a fagade of the building.
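A minimal sketch of one way such a height could be estimated from a calibrated stereoscopic pair is given below; it assumes a simple pinhole camera model with a horizontal baseline and an approximately fronto-parallel view of the fagade, and all names and figures are illustrative assumptions:

    def depth_from_disparity(focal_px, baseline_m, disparity_px):
        """Distance to a fagade point from a calibrated stereo pair:
        depth = focal length (pixels) * baseline (metres) / disparity (pixels)."""
        return focal_px * baseline_m / disparity_px

    def facade_height(depth_m, focal_px, top_row_px, base_row_px):
        """Height of the fagade from the image rows of its top and its base."""
        return depth_m * abs(top_row_px - base_row_px) / focal_px

    depth = depth_from_disparity(focal_px=1400.0, baseline_m=1.5, disparity_px=70.0)   # 30 m
    print(facade_height(depth, focal_px=1400.0, top_row_px=120.0, base_row_px=820.0))  # 15.0 m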
From an image of the image sequences, the fagade of a building is selected. This can be achieved in an automated manner, for example using image processing software or manually by a Digital Map Technician (DMT). In an automated embodiment, the image can be selected by means of the geocoded information and the camera calibration information of the image sequences in combination with the geopositions of the boundary for which a detailed facade has to be generated.
A library of fagade images 208 is also provided. The library of fagade images comprises a plurality of images of fagades of buildings organised according to category of building; for example, a Georgian fagade would have "Georgian" as its category. The category can, in another embodiment, comprise a sub-category, for example to identify a particular style of Georgian fagade. In this example, the fagade images are graphically drawn images, for example "cartoons" of fagades, although the skilled person should appreciate that the fagades can be images obtained, for example, using the mobile mapping vehicle mentioned above.
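One possible in-memory organisation of such a library, with categories and optional sub-categories, is sketched below; the structure and the category names are illustrative assumptions only:

    from dataclasses import dataclass, field

    @dataclass
    class FacadeCategory:
        name: str                                            # e.g. "Georgian"
        image_files: list = field(default_factory=list)      # drawn "cartoon" fagades or textures
        sub_categories: dict = field(default_factory=dict)   # e.g. a particular Georgian style

    library = {
        "Georgian": FacadeCategory(
            "Georgian",
            ["georgian_default.png"],
            {"bow_window": FacadeCategory("Georgian/bow_window", ["georgian_bow.png"])}),
        "ModernOffice": FacadeCategory("ModernOffice", ["office_glass.png"]),
    }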
In operation (Figure 3), once an image associated with the building has been obtained from the image sequence (Step 300), the analysis module 204 analyses (Step 302) the selected image in order to identify the fagade of the building. If a fagade is not identified (Step 304) from the image of the building extracted, the analysis module repeats the above steps of extracting images and trying to identify a fagade from the extracted image. However, if a fagade has been identified, the analysis module 204 then attempts to categorise (Step 306) the fagade identified by referencing the library 208 in an automated manner using an image matching technique, or with the assistance of the DMT when manual comparison is required. Once the category of fagade of the building has been identified, the original map data is augmented or supplemented (Step 308) with the categorisation data and stored (Step 310) in the geospatial database 206. Thereafter, the analysis module 204 determines (Step 312) whether further image sequences need to be analysed in order to categorise fagades of buildings. In the event that further buildings exist that can be categorised, the above process is repeated. Otherwise, the process terminates.
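A minimal sketch of this categorisation loop is given below. The fagade detection and image-matching score are passed in as placeholders, since the embodiments leave open whether matching is automated or performed by the DMT; the database call shown is likewise an assumed interface, not an actual API:

    def categorise_buildings(building_images, library, geospatial_db, detect_facade, match_score):
        """For each building image, select the best-matching fagade category from the
        library and record the building-to-category association in the database."""
        for building_id, image in building_images:                        # Step 300
            facade = detect_facade(image)                                  # Step 302
            if facade is None:                                             # Step 304: try the next image
                continue
            best_category = max(library,                                   # Step 306: image matching
                                key=lambda name: match_score(facade, library[name]))
            geospatial_db.set_facade_category(building_id, best_category)  # Steps 308 and 310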
The augmented geospatial database 206 can be used by various devices and/or systems to provide three dimensional (3D) views of buildings. In this respect, the generation of a three dimensional view of a building will now be described in the context of the navigation apparatus 200 being used to assist a user to navigate from a first location to a second location.
With the above provisos in mind, the Global Positioning System (GPS) of Figure 4, and like systems, are used for a variety of purposes. In general, the GPS is a satellite-radio based navigation system capable of determining continuous position, velocity, time, and in some instances direction information for an unlimited number of users. Formerly known as NAVSTAR, the GPS incorporates a plurality of satellites which orbit the earth in extremely precise orbits. Based on these precise orbits, GPS satellites can relay their location to any number of receiving units.
The GPS system is implemented when a device, specially equipped to receive GPS data, begins scanning radio frequencies for GPS satellite signals. Upon receiving a radio signal from a GPS satellite, the device determines the precise location of that satellite via one of a plurality of different conventional methods. The device will continue scanning, in most instances, for signals until it has acquired at least three different satellite signals (noting that position is not normally, but can be, determined with only two signals using other triangulation techniques). Implementing geometric triangulation, the receiver uses the three known positions to determine its own two-dimensional position relative to the satellites. This can be done in a known manner. Additionally, acquiring a fourth satellite signal allows the receiving device to calculate its three dimensional position by the same geometrical calculation in a known manner. The position and velocity data can be updated in real time on a continuous basis by an unlimited number of users.
As shown in Figure 4, the GPS system 500 comprises a plurality of satellites 502 orbiting the earth 504. A GPS receiver 506 receives spread spectrum GPS satellite data signals 508 from a number of the plurality of satellites 502. The spread spectrum data signals 508 are continuously transmitted from each satellite 502; the spread spectrum data signals 508 each comprise a data stream including information identifying a particular satellite 502 from which the data stream originates. As mentioned above, the GPS receiver 506 generally requires spread spectrum data signals 508 from at least three satellites 502 in order to be able to calculate a two-dimensional position. Receipt of a fourth spread spectrum data signal enables the GPS receiver 506 to calculate, using a known technique, a three-dimensional position.
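For completeness, a minimal two-dimensional sketch of the geometric calculation is given below; it assumes idealised, error-free ranges and flat plane coordinates purely for illustration:

    def trilaterate_2d(p1, r1, p2, r2, p3, r3):
        """Solve for (x, y) from three known positions and ranges by subtracting the
        circle equations, which leaves a 2x2 linear system."""
        (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
        a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
        c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
        a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
        c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
        det = a1 * b2 - a2 * b1
        return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

    # Receiver actually at (3, 4), observed from three beacons with exact ranges.
    print(trilaterate_2d((0, 0), 5.0, (10, 0), 65 ** 0.5, (0, 10), 45 ** 0.5))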
Referring to Figure 5, it should be noted that the block diagram of the navigation apparatus 200 is not inclusive of all components of the navigation apparatus, but is only representative of many example components. The navigation apparatus 200 is located within a housing (not shown). The navigation apparatus 200 includes a processing resource, for example a processor 602, the processor 602 being coupled to an input device 604 and a display device, for example a display screen 606. Although reference is made here to the input device 604 in the singular, the skilled person should appreciate that the input device 604 represents any number of input devices, including a keyboard device, voice input device, touch panel and/or any other known input device utilised to input information. Likewise, the display screen 606 can include any type of display screen such as a Liquid Crystal Display (LCD), for example.
In one arrangement, one aspect of the input device 604, the touch panel, and the display screen 606 are integrated so as to provide an integrated input and display device, including a touchpad or touchscreen input to enable both input of information (via direct input, menu selection, etc.) and display of information through the touch panel screen so that a user need only touch a portion of the display screen 606 to select one of a plurality of display choices or to activate one of a plurality of virtual or "soft" buttons. In this respect, the processor 602 supports a Graphical User Interface (GUI) that operates in conjunction with the touchscreen.
In the navigation apparatus 200, the processor 602 is operatively connected to and capable of receiving input information from input device 604 via a connection 610, and operatively connected to at least one of the display screen 606 and an output device 608, via respective output connections 612, to output information thereto. The output device 608 is, for example, an audible output device (e.g. including a loudspeaker). As the output device 608 can produce audible information for a user of the navigation apparatus 200, it should equally be understood that input device 604 can include a microphone and software for receiving input voice commands as well. Further, the navigation apparatus 200 can also include any additional input device 604 and/or any additional output device, such as audio input/output devices. The processor 602 is operably coupled to a memory resource 614 via connection 616 and is further adapted to receive/send information from/to input/output (I/O) ports 618 via connection 620, wherein the I/O port 618 is connectible to an I/O device 622 external to the navigation apparatus 200. The external I/O device 622 may include, but is not limited to, an external listening device, such as an earpiece for example. The connection to I/O device 622 can further be a wired or wireless connection to any other external device such as a car stereo unit for hands-free operation and/or for voice activated operation for example, for connection to an earpiece or headphones. The memory resource 614 comprises, for example, a volatile memory, such as a Random Access Memory (RAM), and a non-volatile memory, for example a digital memory, such as a flash memory.
Figure 5 further illustrates an operative connection between the processor 602 and an antenna/receiver 624 via connection 626, wherein the antenna/receiver 624 can be a GPS antenna/receiver for example. It should be understood that the antenna and receiver designated by reference numeral 624 are combined schematically for illustration, but that the antenna and receiver may be separately located components, and that the antenna can be a GPS patch antenna or helical antenna for example.
In order to support communications in a Universal Mobile Telecommunications System (UMTS), the processor 602 is also coupled to a cellular communications module 628 constituting the mobile telephone technology. The cellular communications module 628 supports a communications interface 629 for transmitting and receiving data wirelessly. The cellular communications module 628 comprises a Subscriber Identity Module (SIM) (not shown) coupled thereto having a data subscription associated therewith. The subscription is, in this example, for a limited data usage over a predetermined period of time, for example a calendar month. In other embodiments, the subscription need not have a data usage limit. The cellular communications module 628 supports a bidirectional data communications service, for example a packet switched data service, such as a General Packet Radio Service (GPRS) supported by the GSM communications network and/or a High Speed Downlink Packet Access (HSDPA) service supported by the UMTS network. The communications interface 629 is therefore compatible with the bidirectional data communications service. The bidirectional data communications service supports an Internet Protocol (IP) for data communications although use of other protocols, additionally or alternatively, is contemplated.
In this example, the navigation apparatus 200 comprises the cellular communications module 628. However, in another embodiment, a data session can be established, if required, with the communications network via a separate wireless communications terminal (not shown), such as a mobile telephone, PDA, and/or any device with mobile telephone technology, in order to establish a digital connection, for example a digital connection via known Bluetooth technology. In this respect, the navigation apparatus 200 can be Bluetooth enabled in order that the navigation apparatus 200 can be agnostic to the settings of the wireless communications terminal, thereby enabling the navigation apparatus 200 to operate correctly with the ever changing range of mobile telephone models, manufacturers, etc. Model/manufacturer specific settings can, for example, be stored by the navigation apparatus 200, if desired. The data stored for this information can be updated.
It will, of course, be understood by one of ordinary skill in the art that the electronic components shown in Figure 5 are powered by one or more power sources (not shown) in a conventional manner. As will be understood by one of ordinary skill in the art, different configurations of the components shown in Figure 5 are contemplated. For example, the components shown in Figure 5 can be in communication with one another via wired and/or wireless connections and the like. Thus, the navigation apparatus 200 described herein can be a portable or handheld navigation apparatus.
Turning to Figure 6, the memory resource 614 of the navigation apparatus 200 stores a boot loader program (not shown) that is executed by the processor 602 in order to load an operating system 638 from the memory resource 614 for execution by functional hardware components 636, which provides an environment in which application software 640 can run. The operating system 638 serves to control the functional hardware components 636 and resides between the application software 640 and the functional hardware components 636. The application software 640 provides an operational environment including the GUI that supports core functions of the navigation apparatus 200, for example map viewing, route planning, navigation functions and any other functions associated therewith. In this example, in order to render images of buildings in 3D, the application software 640 supports an image generation module 642.
Referring to Figure 7, the image generation module 642 comprises an image generation engine 650. The image generation engine 650 is capable of communicating with a location determination module 652 that is also supported by the application software. In this respect, the location determination module 652 uses data obtained from the GPS receiver in order to determine the location of the navigation apparatus 200, which is an example of a mobile computing apparatus. The image generation engine 650 is also operably coupled to the display device 606. Furthermore, the image generation engine 650 is also capable of accessing a map database 654, the map database having been derived from the geospatial database 206 mentioned above. In this example, the map database 654 identifies buildings and also comprises category data associated with the respective fagades of the buildings and respective attribute data associated with the fagades of the buildings.
Fagade images associated with the categories of fagades are provided in a local library of fagade images 656. However, the skilled person should appreciate that, if required, the library of fagade images 656 can be stored remotely and retrieved via a communications network.
In operation (Figure 8), the image generation engine 650 obtains (Step 800) a current location of the navigation apparatus 200 from the location determination module 652. The image generation engine 650 then, as part of a suitable algorithm for presenting the geospatial features in relation to an area surrounding the current location, accesses (Step 802) the map database 654 and obtains geospatial data required to generate images of geospatial features in the vicinity of the current location. In relation to a building, the category of the fagade and any attributes of the fagade, for example height, width and/or colour of the fagade, are obtained from the map database 654. The image generation engine then accesses (Step 804) the library of fagade images 656 and retrieves a graphical representation of the fagade associated with the category retrieved from the map database 654 in respect of the building.
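A minimal sketch of this look-up (Steps 800 to 804) is given below; the map database and library interfaces shown are assumed placeholders rather than an actual API:

    def facades_for_view(current_location, map_db, facade_library, radius_m=250.0):
        """For each building near the current location, fetch its fagade category and
        attributes from the map database and the matching image from the library."""
        results = []
        for building in map_db.buildings_near(current_location, radius_m):   # Step 802
            record = map_db.facade_record(building.id)                        # category, height, colour
            image = facade_library.image_for(record.category)                 # Step 804
            results.append((building, record, image))
        return results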
The image generation engine then uses the attributes and the image retrieved from the library 656 in order to generate (Step 806) a 3D representation of the building.
3D representations of buildings can be made with varying levels of detail.
Examples of these varying levels are described in "Navigate by Maps for Multi-Modal Transport" (Vande Velde, Linde, Intelligent Transportation Systems (ITS) Madrid 2003). A first, aesthetically basic, level of detail is a so-called generic block model or building "profile". The "profile" can be generated using any technique known in the art, e.g. as described in WO 2007/045272. For example, the profile can be generated from the 2D footprint of a building (2D footprints are commonly stored with digital maps). However, in order to arrive at a quasi-realistic representation of a building, texture can be assigned to the blocks. In this respect, in another embodiment, the library 656 can comprise textures constituting fagade images (and hence can have respective categories associated therewith) and a texture mapping technique can be employed to map the texture onto the building profile.
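A minimal sketch of constructing such a block model and pairing it with a category texture is given below; the record fields and library call are assumed placeholders, and a real implementation would hand the resulting geometry and texture to a graphics API:

    def extrude_footprint(footprint_xy, height_m):
        """Turn a 2D footprint polygon into the wall quads of a block model ("profile")."""
        walls = []
        n = len(footprint_xy)
        for i in range(n):
            (x1, y1), (x2, y2) = footprint_xy[i], footprint_xy[(i + 1) % n]
            walls.append([(x1, y1, 0.0), (x2, y2, 0.0), (x2, y2, height_m), (x1, y1, height_m)])
        return walls

    def textured_building(record, facade_library, footprint_xy):
        """Combine the stored attributes (height, colour) with the shared category texture."""
        walls = extrude_footprint(footprint_xy, record.height_m)
        texture = facade_library.image_for(record.category)    # one texture shared per category
        return walls, texture, record.colour                    # colour can be used to tint the texture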
In the present example, this process is repeated for any other buildings that need to be displayed by the navigation apparatus 200. Although, in this example, the library of fagade images comprises images of whole fagades, the skilled person should appreciate that the fagade image stored can be a fagade component image that is a repeatably reproducible image for constructing the fagade image associated with the building.
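A minimal sketch of covering a fagade of a given size with such a repeatable component image is given below; the component dimensions are assumed for illustration only:

    import math

    def tile_counts(facade_width_m, facade_height_m, component_width_m=3.0, component_height_m=3.0):
        """Number of horizontal and vertical repeats of a fagade component image needed
        to cover a fagade of the given size (the component size is an assumed value)."""
        across = max(1, math.ceil(facade_width_m / component_width_m))
        up = max(1, math.ceil(facade_height_m / component_height_m))
        return across, up

    print(tile_counts(21.0, 15.0))   # (7, 5) repeats of the component image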
In the above example, the navigation apparatus 200 is being used to generate a representation of one or more buildings without providing navigation assistance, for example when a user engages in the so-called "free driving" mentioned above. However, the skilled person should appreciate that the representation of building fagades as described above can be employed in the context of the navigation apparatus 200 providing navigation assistance between a first specified location and a second specified location. Similarly, the skilled person should appreciate that the embodiments described herein are not exclusively applicable to navigation apparatus and can be employed in relation to other portable computing apparatus for which it is desirable to display buildings in three dimensions. Indeed, implementations are not limited to portable computing apparatus and the embodiments described herein can be applied to client-server topologies.
Although the above embodiments have been described in the context of a building, the skilled person should appreciate that the above embodiments are applicable to any suitable geospatial feature or object, for example any appropriate structure, or portion thereof, such as a roof or a wall.
Whilst embodiments described in the foregoing detailed description refer to GPS, it should be noted that the navigation apparatus may utilise any kind of position sensing technology as an alternative to (or indeed in addition to) the GPS. For example, the navigation apparatus may utilise other global navigation satellite systems (GNSS) such as the proposed European Galileo system when available. Equally, it is not limited to satellite-based systems, but could readily function using ground-based beacons or any other kind of system that enables the device to determine its geographic location, for example the long range navigation (LORAN)-C system.
Alternative embodiments of the invention can be implemented as a computer program product for use with a computer system, the computer program product being, for example, a series of computer instructions stored on a tangible data recording medium, such as a diskette, CD-ROM, ROM, or fixed disk, or embodied in a computer data signal, the signal being transmitted over a tangible medium or a wireless medium, for example, microwave or infrared. The series of computer instructions can constitute all or part of the functionality described above, and can also be stored in any memory device, volatile or non-volatile, such as semiconductor, magnetic, optical or other memory device.
It will also be well understood by persons of ordinary skill in the art that whilst the preferred embodiment implements certain functionality by means of software, that functionality could equally be implemented solely in hardware (for example by means of one or more ASICs (application specific integrated circuit)) or indeed by a mix of hardware and software. As such, the scope of the present invention should not be interpreted as being limited only to being implemented in software.
Lastly, it should also be noted that whilst the accompanying claims set out particular combinations of features described herein, the scope of the present invention is not limited to the particular combinations hereafter claimed, but instead extends to encompass any combination of features or embodiments herein disclosed irrespective of whether or not that particular combination has been specifically enumerated in the accompanying claims at this time.

Claims

1. A method of generating fagade data for a geospatial database, the method comprising:
collecting image data relating to the fagade of a building;
categorising the fagade of the building from the collected image data by selecting a fagade category to be associated with the building, the fagade category being selected from a plurality of predetermined fagade categories;
providing a library of fagade images comprising a plurality of facade images, the fagade category having at least one fagade image associated therewith; and
recording an association between the building and the fagade category for access of the fagade image.
2. A method as claimed in Claim 1, further comprising:
determining a first attribute of the fagade of the building; and
recording the first attribute in respect of the building.
3. A method as claimed in Claim 2, further comprising:
determining a second attribute of the fagade of the building; and
recording the second attribute in respect of the building.
4. A method as claimed in Claim 1, wherein the selection of the fagade category is in response to obtaining a substantial image match between the fagade of the building and a fagade image associated with the selected fagade category.
5. A method as claimed in Claim 2, wherein the first attribute is a height associated with the fagade of the building or the first attribute is a colour associated with the fagade of the building.
6. A method as claimed in Claim 3, wherein the second attribute is a colour associated with the fagade of the building or the second attribute is a height associated with the fagade of the building.
7. A method as claimed in Claim 1, wherein the fagade category comprises at least one sub-category.
8. A method as claimed in Claim 1, wherein the fagade image is a fagade component image, the fagade component image being a repeatably reproducible image for constructing the fagade image associated with the building.
9. A method as claimed in Claim 1, further comprising:
using an electronic storage device to store the association between the building and the fagade category.
10. A method as claimed in Claim 1, wherein the plurality of fagade images of the library of fagade images are stored as one or more textures.
11. A building fagade information generation system comprising:
an input arranged to receive image data relating to the fagade of a building; a processing resource arranged to support an analyser module, the analyser module being arranged to categorise the fagade of the building by selecting a fagade category to be associated with the building, the fagade category being selected from a plurality of predetermined fagade categories;
a library of fagade images comprising a plurality of facade images, the fagade category having at least one fagade image associated therewith; and
a geospatial database identifying the building; wherein
the analyser module is arranged to update the geospatial database with an association between the building and fagade category for access of the fagade image.
12. A system as claimed in Claim 11, further comprising:
determining at least one attribute of the fagade of the building; and
recording the at least one attribute in respect of the building.
13. A system as claimed in Claim 11, wherein the analyser module is arranged to select the fagade category in response to determining a substantial image match between the fagade of the building and a fagade image associated with the selected fagade category.
14. A system as claimed in Claim 11, wherein the at least one attribute is one of: a height associated with the fagade of the building; and a colour associated with the fagade of the building.
15. A system as claimed in Claim 11, further comprising:
an electronic storage device arranged to store the association between the building and the fagade category.
16. A system as claimed in Claim 11, wherein the plurality of fagade images of the library of fagade images comprise one or more textures.
17. A mobile computing apparatus comprising:
a processing resource arranged to support, when in use, an operational environment, the operational environment supporting a location determination module and an image generation module; and
a map database comprising geospatial data associated with a building and building fagade category data associated with the building; wherein
the location determination module is arranged to determine a current location; the image generation module is arranged to obtain the current location from the location determination module and to determine that the building needs to be displayed in three dimensions;
the image generation module accesses the map database and uses the building fagade category data stored in relation to the building in order to access a fagade image from a library of building fagade images, the library of building fagade images comprising at least one fagade image associated with the building fagade category; and
the image generation module is arranged to generate an image of the building as a three dimensional view using the fagade image.
18. A navigation apparatus comprising the mobile computing apparatus as claimed in Claim 17.
19. A method of rendering a fagade of a building for display by a mobile computing apparatus, the method comprising:
the mobile computing apparatus determining a current location;
an image generation module obtaining the current location and determining that a building needs to be displayed in three dimensions;
accessing a map database, the map database comprising geospatial data associated with the building and building fagade category data associated with the building;
the image generation module using the building fagade category data stored in relation to the building in order to access a fagade image from a library of building fagade images, the library of building fagade images comprising at least one fagade image associated with the building fagade category; and
the image generation module generating an image of the building as a three dimensional view using the fagade image.
20. A method as claimed in Claim 19, wherein
the image generation module obtains a first attribute from the map database, the attribute being associated with the fagade of the building; and
the image generation module adapts the fagade image in accordance with the first attribute.
21. A method as claimed in Claim 20, wherein the first attribute is a height associated with the fagade of the building or the first attribute is a colour associated with the fagade of the building.
22. A method as claimed in Claim 20, wherein
the image generation module obtains a second attribute from the map database, the attribute being associated with the fagade of the building; and
the image generation module adapts the fagade image in accordance with the second attribute.
23. A method as claimed in Claim 22, wherein the second attribute is a height associated with the fagade of the building or the second attribute is a colour associated with the fagade of the building.
24. A method as claimed in Claim 19, wherein
the plurality of fagade images of the library of fagade images comprise one or more textures; and
the generation of the image of the building comprises mapping the texture or textures onto a building profile.
PCT/US2010/061957 2009-12-23 2010-12-23 Method of generating building facade data for a geospatial database for a mobile device WO2011079241A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US28942409P 2009-12-23 2009-12-23
US61/289,424 2009-12-23

Publications (1)

Publication Number Publication Date
WO2011079241A1 true WO2011079241A1 (en) 2011-06-30

Family

ID=44196149

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2010/061957 WO2011079241A1 (en) 2009-12-23 2010-12-23 Method of generating building facade data for a geospatial database for a mobile device

Country Status (1)

Country Link
WO (1) WO2011079241A1 (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140369595A1 (en) * 2008-11-05 2014-12-18 Hover, Inc. Computer vision database platform for a three-dimensional mapping system
US9437044B2 (en) 2008-11-05 2016-09-06 Hover Inc. Method and system for displaying and navigating building facades in a three-dimensional mapping system
US9437033B2 (en) 2008-11-05 2016-09-06 Hover Inc. Generating 3D building models with ground level and orthogonal images
US9830681B2 (en) 2014-01-31 2017-11-28 Hover Inc. Multi-dimensional model dimensioning and scale error correction
US9836881B2 (en) 2008-11-05 2017-12-05 Hover Inc. Heat maps for 3D maps
US9934608B2 (en) 2015-05-29 2018-04-03 Hover Inc. Graphical overlay guide for interface
US10038838B2 (en) 2015-05-29 2018-07-31 Hover Inc. Directed image capture
US10127721B2 (en) 2013-07-25 2018-11-13 Hover Inc. Method and system for displaying and navigating an optimal multi-dimensional building model
US10133830B2 (en) 2015-01-30 2018-11-20 Hover Inc. Scaling in a multi-dimensional building model
US10178303B2 (en) 2015-05-29 2019-01-08 Hover Inc. Directed image capture
US10410412B2 (en) 2015-05-29 2019-09-10 Hover Inc. Real-time processing of captured building imagery
US10410413B2 (en) 2015-05-29 2019-09-10 Hover Inc. Image capture for a multi-dimensional building model
US10861224B2 (en) 2013-07-23 2020-12-08 Hover Inc. 3D building analyzer
US11574439B2 (en) 2013-07-23 2023-02-07 Hover Inc. Systems and methods for generating three dimensional geometry
US11721066B2 (en) 2013-07-23 2023-08-08 Hover Inc. 3D building model materials auto-populator
US11790610B2 (en) 2019-11-11 2023-10-17 Hover Inc. Systems and methods for selective image compositing
CN117454253A (en) * 2023-12-08 2024-01-26 深圳市蕾奥规划设计咨询股份有限公司 Building classification method, device, terminal equipment and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060110026A1 (en) * 2002-07-10 2006-05-25 Marek Strassenburg-Kleciak System for generatingthree-dimensional electronic models of objects
US20080221843A1 (en) * 2005-09-01 2008-09-11 Victor Shenkar System and Method for Cost-Effective, High-Fidelity 3D-Modeling of Large-Scale Urban Environments
US20090273601A1 (en) * 2008-04-30 2009-11-05 Core Logic, Inc. Image Presentation Method and Apparatus for 3D Navigation and Mobile Device Including the Apparatus
US20090281728A1 (en) * 2008-05-08 2009-11-12 Microsoft Corporation Navigation with contextual color, texture, and structure cues

Cited By (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11113877B2 (en) 2008-11-05 2021-09-07 Hover Inc. Systems and methods for generating three dimensional geometry
US9953459B2 (en) 2008-11-05 2018-04-24 Hover Inc. Computer vision database platform for a three-dimensional mapping system
US9437033B2 (en) 2008-11-05 2016-09-06 Hover Inc. Generating 3D building models with ground level and orthogonal images
US10769847B2 (en) 2008-11-05 2020-09-08 Hover Inc. Systems and methods for generating planar geometry
US9836881B2 (en) 2008-11-05 2017-12-05 Hover Inc. Heat maps for 3D maps
US11741667B2 (en) 2008-11-05 2023-08-29 Hover Inc. Systems and methods for generating three dimensional geometry
US9437044B2 (en) 2008-11-05 2016-09-06 Hover Inc. Method and system for displaying and navigating building facades in a three-dimensional mapping system
US11574441B2 (en) 2008-11-05 2023-02-07 Hover Inc. Systems and methods for generating three dimensional geometry
US10776999B2 (en) 2008-11-05 2020-09-15 Hover Inc. Generating multi-dimensional building models with ground level images
US10643380B2 (en) 2008-11-05 2020-05-05 Hover, Inc. Generating multi-dimensional building models with ground level images
US11574442B2 (en) 2008-11-05 2023-02-07 Hover Inc. Systems and methods for generating three dimensional geometry
US20140369595A1 (en) * 2008-11-05 2014-12-18 Hover, Inc. Computer vision database platform for a three-dimensional mapping system
US10867437B2 (en) 2013-06-12 2020-12-15 Hover Inc. Computer vision database platform for a three-dimensional mapping system
US11954795B2 (en) 2013-06-12 2024-04-09 Hover Inc. Computer vision database platform for a three-dimensional mapping system
US10861224B2 (en) 2013-07-23 2020-12-08 Hover Inc. 3D building analyzer
US11670046B2 (en) 2013-07-23 2023-06-06 Hover Inc. 3D building analyzer
US11276229B2 (en) 2013-07-23 2022-03-15 Hover Inc. 3D building analyzer
US11574439B2 (en) 2013-07-23 2023-02-07 Hover Inc. Systems and methods for generating three dimensional geometry
US11935188B2 (en) 2013-07-23 2024-03-19 Hover Inc. 3D building analyzer
US11721066B2 (en) 2013-07-23 2023-08-08 Hover Inc. 3D building model materials auto-populator
US10902672B2 (en) 2013-07-23 2021-01-26 Hover Inc. 3D building analyzer
US10657714B2 (en) 2013-07-25 2020-05-19 Hover, Inc. Method and system for displaying and navigating an optimal multi-dimensional building model
US10127721B2 (en) 2013-07-25 2018-11-13 Hover Inc. Method and system for displaying and navigating an optimal multi-dimensional building model
US11783543B2 (en) 2013-07-25 2023-10-10 Hover Inc. Method and system for displaying and navigating an optimal multi-dimensional building model
US10977862B2 (en) 2013-07-25 2021-04-13 Hover Inc. Method and system for displaying and navigating an optimal multi-dimensional building model
US9830681B2 (en) 2014-01-31 2017-11-28 Hover Inc. Multi-dimensional model dimensioning and scale error correction
US10297007B2 (en) 2014-01-31 2019-05-21 Hover Inc. Multi-dimensional model dimensioning and scale error correction
US11676243B2 (en) 2014-01-31 2023-06-13 Hover Inc. Multi-dimensional model reconstruction
US11017612B2 (en) 2014-01-31 2021-05-25 Hover Inc. Multi-dimensional model dimensioning and scale error correction
US11030823B2 (en) 2014-01-31 2021-06-08 Hover Inc. Adjustment of architectural elements relative to facades
US10453177B2 (en) 2014-01-31 2019-10-22 Hover Inc. Multi-dimensional model dimensioning and scale error correction
US10475156B2 (en) 2014-01-31 2019-11-12 Hover, Inc. Multi-dimensional model dimensioning and scale error correction
US10515434B2 (en) 2014-01-31 2019-12-24 Hover, Inc. Adjustment of architectural elements relative to facades
US10133830B2 (en) 2015-01-30 2018-11-20 Hover Inc. Scaling in a multi-dimensional building model
US10410412B2 (en) 2015-05-29 2019-09-10 Hover Inc. Real-time processing of captured building imagery
US9934608B2 (en) 2015-05-29 2018-04-03 Hover Inc. Graphical overlay guide for interface
US10038838B2 (en) 2015-05-29 2018-07-31 Hover Inc. Directed image capture
US11070720B2 (en) 2015-05-29 2021-07-20 Hover Inc. Directed image capture
US10178303B2 (en) 2015-05-29 2019-01-08 Hover Inc. Directed image capture
US10803658B2 (en) 2015-05-29 2020-10-13 Hover Inc. Image capture for a multi-dimensional building model
US11574440B2 (en) 2015-05-29 2023-02-07 Hover Inc. Real-time processing of captured building imagery
US11729495B2 (en) 2015-05-29 2023-08-15 Hover Inc. Directed image capture
US11538219B2 (en) 2015-05-29 2022-12-27 Hover Inc. Image capture for a multi-dimensional building model
US10410413B2 (en) 2015-05-29 2019-09-10 Hover Inc. Image capture for a multi-dimensional building model
US10681264B2 (en) 2015-05-29 2020-06-09 Hover, Inc. Directed image capture
US10713842B2 (en) 2015-05-29 2020-07-14 Hover, Inc. Real-time processing of captured building imagery
US11790610B2 (en) 2019-11-11 2023-10-17 Hover Inc. Systems and methods for selective image compositing
CN117454253A (en) * 2023-12-08 2024-01-26 深圳市蕾奥规划设计咨询股份有限公司 Building classification method, device, terminal equipment and storage medium
CN117454253B (en) * 2023-12-08 2024-04-02 深圳市蕾奥规划设计咨询股份有限公司 Building classification method, device, terminal equipment and storage medium

Similar Documents

Publication Publication Date Title
WO2011079241A1 (en) Method of generating building facade data for a geospatial database for a mobile device
US10393538B2 (en) Navigation apparatus and method of providing weather condition information
US10168175B2 (en) Navigation apparatus, server apparatus and method of collecting parking location information
US8756000B2 (en) Navigation apparatus and method of detection that a parking facility is sought
EP2427727B1 (en) Location point determination apparatus, map generation system, navigation apparatus and method of determining a location point
US8825357B2 (en) Navigation device and method
EP2649412B1 (en) Mapping apparatus and method of operation thereof
US20120259478A1 (en) Satellite signal acquisition apparatus, navigation apparatus and method of acquiring a satellite signal
US20160054137A1 (en) Navigation device with enhanced widgets and applications
WO2008083978A1 (en) Improved navigation device and method
WO2010081545A1 (en) Navigation apparatus, server apparatus and method of providing an indication of likelihood of occupancy of a parking location
US8886455B2 (en) Navigation apparatus, audible instruction generation system and method of generating audible instructions
EP2406583B1 (en) Apparatus for enriching a representation of a parking location and method of enriching a representation of a parking location
US20140297177A1 (en) Navigation device having dead reckoning navigation functionality and method thereof
WO2011095226A1 (en) Apparatus and method for generating a view
WO2010081538A2 (en) Navigation device & method
JP2006275656A (en) Navigation system, navigation method, and navigation program
TW201232474A (en) Method of generating facade data for a geospatial database, building facade information generation system, mobile computing apparatus and method of rendering a facade of a building

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 10840146; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 10840146; Country of ref document: EP; Kind code of ref document: A1)