WO2011095226A1 - Apparatus and method for generating a view - Google Patents

Apparatus and method for generating a view

Info

Publication number
WO2011095226A1
Authority
WO
WIPO (PCT)
Prior art keywords
footprint
navigation
roof
processing resource
mapping apparatus
Prior art date
Application number
PCT/EP2010/051493
Other languages
English (en)
Inventor
Witold Studzinski
Original Assignee
Tomtom Polska Sp.Z.O.O
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tomtom Polska Sp.Z.O.O
Priority to PCT/EP2010/051493 priority Critical patent/WO2011095226A1/fr
Publication of WO2011095226A1 publication Critical patent/WO2011095226A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05Geographic models
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3635Guidance using 3D or perspective road maps
    • G01C21/3638Guidance using 3D or perspective road maps including 3D objects and buildings
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics

Definitions

  • the present invention relates to a mapping system or navigation apparatus of the type that, for example, provides a three-dimensional model or view in respect of a location.
  • Portable computing devices for example Portable Navigation Devices (PNDs) that include GPS (Global Positioning System) signal reception and processing functionality are well known and are widely employed as in-car or other vehicle navigation systems.
  • a modern PND comprises a processor, memory (at least one of volatile and non-volatile, and commonly both), and map data stored within said memory.
  • the processor and memory cooperate to provide an execution environment in which a software operating system may be established, and additionally it is commonplace for one or more additional software programs to be provided to enable the functionality of the PND to be controlled, and to provide various other functions.
  • these devices further comprise one or more input interfaces that allow a user to interact with and control the device, and one or more output interfaces by means of which information may be relayed to the user.
  • output interfaces include a visual display and a speaker for audible output.
  • input interfaces include one or more physical buttons to control on/off operation or other features of the device (which buttons need not necessarily be on the device itself but could be on a steering wheel if the device is built into a vehicle), and a microphone for detecting user speech.
  • the output interface display may be configured as a touch sensitive display (by means of a touch sensitive overlay or otherwise) additionally to provide an input interface by means of which a user can operate the device by touch.
  • Devices of this type will also often include one or more physical connector interfaces by means of which power and optionally data signals can be transmitted to and received from the device, and optionally one or more wireless transmitters/receivers to allow communication over cellular telecommunications and other signal and data networks, for example Bluetooth, Wi-Fi, Wi-Max, GSM, UMTS and the like.
  • PNDs of this type also include a GPS antenna by means of which satellite- broadcast signals, including location data, can be received and subsequently processed to determine a current location of the device.
  • the PND may also include electronic gyroscopes and accelerometers which produce signals that can be processed to determine the current angular and linear acceleration, and in turn, and in conjunction with location information derived from the GPS signal, velocity and relative displacement of the device and thus the vehicle in which it is mounted.
  • The utility of such PNDs is manifested primarily in their ability to determine a route between a first location (typically a start or current location) and a second location (typically a destination). These locations can be input by a user of the device, by any of a wide variety of different methods, for example by postcode, street name and house number, previously stored "well known" destinations (such as famous locations, municipal locations (such as sports grounds or swimming baths) or other points of interest), and favourite or recently visited destinations.
  • the PND is enabled by software for computing a “best” or “optimum” route between the start and destination address locations from the map data.
  • a “best” or “optimum” route is determined on the basis of predetermined criteria and need not necessarily be the fastest or shortest route.
  • the selection of the route along which to guide the driver can be very sophisticated, and the selected route may take into account existing, predicted and dynamically and/or wirelessly received traffic and road information, historical information about road speeds, and the driver's own preferences for the factors determining road choice (for example the driver may specify that the route should not include motorways or toll roads).
  • the device may continually monitor road and traffic conditions, and offer to or choose to change the route over which the remainder of the journey is to be made due to changed conditions.
  • Real time traffic monitoring systems based on various technologies (e.g. mobile phone data exchanges, fixed cameras, GPS fleet tracking) are being used to identify traffic delays and to feed the information into notification systems.
  • PNDs of this type may typically be mounted on the dashboard or windscreen of a vehicle, but may also be formed as part of an on-board computer of the vehicle radio or indeed as part of the control system of the vehicle itself.
  • the navigation device may also be part of a hand-held system, such as a PDA (Portable Digital Assistant), a media player, a mobile phone or the like, and in these cases, the normal functionality of the hand-held system is extended by means of the installation of software on the device to perform both route calculation and navigation along a calculated route.
  • Mapping, route planning and navigation functionality may also be provided by a desktop or mobile computing resource running appropriate software.
  • the Royal Automobile Club (RAC) provides an on-line route planning and navigation facility at http://www.rac.co.uk, which facility allows a user to enter a start point and a destination whereupon the server with which the user's computing resource is communicating calculates a route (aspects of which may be user specified), generates a map, and generates a set of exhaustive navigation instructions for guiding the user from the selected start point to the selected destination.
  • the facility also provides for pseudo rendering of a calculated route, and route preview functionality which simulates a user travelling along the route and thereby provides the user with a preview of the calculated route.
  • the user interacts with the navigation device to select the desired calculated route, optionally from a list of proposed routes.
  • the user may intervene in, or guide the route selection process, for example by specifying that certain routes, roads, locations or criteria are to be avoided or are mandatory for a particular journey.
  • the route calculation aspect of the PND forms one primary function, and navigation along such a route is another primary function.
  • During navigation along a calculated route, it is usual for such PNDs to provide visual and/or audible instructions to guide the user along a chosen route to the end of that route, i.e. the desired destination. It is also usual for PNDs to display map information on-screen during the navigation, such information regularly being updated on-screen so that the map information displayed is representative of the current location of the device, and thus of the user or user's vehicle if the device is being used for in-vehicle navigation.
  • An icon displayed on-screen typically denotes the current device location, and is centred with the map information of current and surrounding roads in the vicinity of the current device location and other map features also being displayed. Additionally, navigation information may be displayed, optionally in a status bar above, below or to one side of the displayed map information; examples of navigation information include a distance to the next deviation from the current road required to be taken by the user, the nature of that deviation possibly being represented by a further icon suggestive of the particular type of deviation, for example a left or right turn.
  • the navigation function also determines the content, duration and timing of audible instructions by means of which the user can be guided along the route. As can be appreciated a simple instruction such as "turn left in 100 m" requires significant processing and analysis.
  • user interaction with the device may be by a touch screen, or additionally or alternately by steering column mounted remote control, by voice activation or by any other suitable method.
  • a further important function provided by the device is automatic route recalculation in the event that: a user deviates from the previously calculated route during navigation (either by accident or intentionally); real-time traffic conditions dictate that an alternative route would be more expedient and the device is suitably enabled to recognize such conditions automatically, or if a user actively causes the device to perform route re-calculation for any reason.
  • a route to be calculated with user defined criteria; for example, the user may prefer a scenic route to be calculated by the device, or may wish to avoid any roads on which traffic congestion is likely, expected or currently prevailing.
  • the device software would then calculate various routes and weigh more favourably those that include along their route the highest number of points of interest (known as POIs) tagged as being for example of scenic beauty, or, using stored information indicative of prevailing traffic conditions on particular roads, order the calculated routes in terms of a level of likely congestion or delay on account thereof.
  • Other POI-based and traffic information-based route calculation and navigation criteria are also possible.
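The POI-weighted route ordering described above can be sketched as follows (the function name, data layout and single-criterion scoring are illustrative assumptions; a real device combines many criteria):

```python
def rank_routes_by_poi(routes, poi_tags, wanted_tag="scenic"):
    """Order candidate routes by the number of POIs of a wanted tag
    they pass. `routes` maps a route name to the POI ids along it;
    `poi_tags` maps a POI id to its tag. Illustrative only."""
    def score(route_name):
        return sum(1 for poi in routes[route_name]
                   if poi_tags.get(poi) == wanted_tag)
    # Highest-scoring (most scenic) route first.
    return sorted(routes, key=score, reverse=True)

routes = {"coastal": [1, 2, 3], "motorway": [4]}
poi_tags = {1: "scenic", 2: "scenic", 3: "museum", 4: "scenic"}
ranked = rank_routes_by_poi(routes, poi_tags)
```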
  • Although route calculation and navigation functions are fundamental to the overall utility of PNDs, it is possible to use the device purely for information display, or "free-driving", in which only map information relevant to the current device location is displayed, and in which no route has been calculated and no navigation is currently being performed by the device. Such a mode of operation is often applicable when the user already knows the route along which it is desired to travel and does not require navigation assistance.
  • the device may be used to display map information relating to a location input by the user, wherein the location may have no relation to the user's current location.
  • Devices of the type described above, for example the 920T model manufactured and supplied by TomTom International B.V., provide a reliable means for enabling users to navigate from one position to another. Such devices are of great utility when the user is not familiar with the route to the destination to which they are navigating.
  • the memory of the PND stores map data used by the PND not only to calculate routes and provide necessary navigation instructions to users, but also to provide visual information to users through the visual display of the PND.
  • Map information can be expressed in a number of ways and indeed can comprise a number of separate information components, which are used in combination by the PND.
  • information can be expressed as various birds-eye map projections, satellite images, terrain information, field of view images and 3D models. 3D models and field of view images may advantageously give an impression of the view that a user might see if they were at a location, which may help, for example, in recognising the location.
  • 3D models may be defined using 3 dimensional coordinates of vertices that make up the model.
  • some data for texture mapping may also be stored. The result of this is that a considerable amount of data is required in order to generate 3D models.
  • the 3D image or model can take up a large amount of memory or disk space. This is particularly problematic for portable devices or devices in which the memory or other storage space is limited. Furthermore, transmission of model data, for example over a network or communications link, may take up considerable bandwidth and/or may require a long time to transmit.
  • a navigation and/or mapping apparatus comprising: a processing resource operably coupled to a data store, wherein the data store is adapted to store at least one footprint source and the processing resource is adapted to obtain a footprint from the footprint source, generate a model of a structure based on the footprint and provide image data relating to the model.
  • the apparatus may comprise a display device operably coupled to the processing resource, the processing resource supporting, when in use, a view generation engine and the display device being arranged to receive the image data from the view generation engine and display an image responsive to the image data.
  • the footprint and/or the model may be an image.
  • the footprint may be a building footprint.
  • the footprint may define a 2 dimensional representation of the building.
  • the footprint may define a perimeter, outline or border of a building.
  • the perimeter may comprise a series of component lines, which may define at least one polygon.
  • the footprint source may comprise a footprint.
  • the footprint source may comprise a birds-eye view, aerial view, satellite view and/or plan view image.
  • the processing resource may be adapted to determine at least one footprint from the footprint source, which may comprise edge detection, pattern matching or the like.
  • the model may be a 3 dimensional (3D) model.
  • 3D models may be virtual 3D models, represented on a flat screen display but whose display comprises a representation of depth and/or may be rotatable to be viewed from any angle.
  • the 3D model may be a 3D model representing at least part of a building.
  • the processing resource may be adapted to convert 2D footprint images into 3D models.
  • the processing resource may be adapted to generate walls for the model by extrapolating the footprint through a distance corresponding to a wall height.
  • the processing resource may be adapted to determine a roof shape using the footprint.
  • the roof shape may comprise a 3D roof model.
  • the processing resource may be adapted to determine vertices of the footprint.
  • the vertices may comprise a point at which the perimeter changes direction.
  • the vertices may represent a corner of a building.
  • the processing resource may be adapted to determine bisectors for at least one, and preferably each, vertex of the footprint.
  • the bisectors may comprise a straight line passing through a vertex, and extending internally of the footprint and having an equal angle between the bisector and each portion of the footprint forming the vertex.
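The bisector construction just described can be sketched as follows (the function name and the counter-clockwise coordinate convention are assumptions, not taken from the patent): summing the unit vectors along the two perimeter segments meeting at a vertex yields a direction making equal angles with both, i.e. the interior bisector at a convex corner.

```python
import math

def bisector_direction(prev_pt, vertex, next_pt):
    """Unit direction of the angle bisector at `vertex`, making an
    equal angle with the two perimeter segments meeting there.
    For a convex corner of a CCW footprint it points into the
    interior. Illustrative sketch only."""
    ux, uy = prev_pt[0] - vertex[0], prev_pt[1] - vertex[1]
    vx, vy = next_pt[0] - vertex[0], next_pt[1] - vertex[1]
    lu, lv = math.hypot(ux, uy), math.hypot(vx, vy)
    # Sum of the two unit edge vectors bisects the angle between them.
    bx, by = ux / lu + vx / lv, uy / lu + vy / lv
    norm = math.hypot(bx, by)
    return (bx / norm, by / norm)

# A right-angled footprint corner at the origin: the bisector
# extends into the interior at 45 degrees.
d = bisector_direction((1.0, 0.0), (0.0, 0.0), (0.0, 1.0))
```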
  • the processing resource may be adapted to determine intersections between bisectors.
  • the processing resource may be adapted to determine only the closest intersection to the associated vertex for at least one, and preferably each, bisector.
  • the processing resource may be adapted to disregard any second and subsequent intersection on at least one, and preferably each, bisector.
  • the processing resource may be adapted to determine the closest intersection or intersections to the perimeter and may be adapted to calculate a shift amount corresponding to the distance between the closest intersection or intersections and the perimeter of the footprint.
  • the processing resource may be adapted to generate lines forming roof features, the generated lines lying parallel to corresponding constituent lines of the footprint, and shifted toward the centre of the footprint by an amount corresponding to the shift distance from a corresponding constituent line of the footprint such that each point on each generated line lies a distance corresponding to the shift distance from the closest point on the footprint.
  • the roof features may comprise one or more polygons, lines or points.
  • the processing resource may be adapted to repeat the above process using the polygon as a starting point rather than the footprint, i.e. the processing resource may be adapted to determine vertices of the polygon, determine bisectors associated with at least one vertex of the polygon, determine intersections between the bisectors, determine a shift distance corresponding to a distance between the closest intersection and the polygon, and generate further roof features by generating further lines lying parallel to corresponding constituent lines of the polygon, and shifted toward the centre of the polygon by an amount corresponding to the shift distance from a corresponding constituent line of the polygon such that each point on each further generated line lies a distance corresponding to the shift distance from the closest point on the polygon.
  • the processing resource may be adapted to iteratively repeat the roof feature generation process until no further polygons are generated, and all polygons have been degenerated to lines or points.
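One iteration of the roof-feature step can be sketched for the simple case of a convex footprint given in counter-clockwise order (function names are illustrative; the patent does not prescribe an implementation, and concave footprints need the fuller bisector-intersection treatment described above):

```python
import math

def _offset_edge(p, q, d):
    # Edge pq shifted inward by d: returns (point on line, direction).
    dx, dy = q[0] - p[0], q[1] - p[1]
    length = math.hypot(dx, dy)
    nx, ny = -dy / length, dx / length   # inward (left) normal for CCW order
    return ((p[0] + d * nx, p[1] + d * ny), (dx, dy))

def _intersect(a, da, b, db):
    # Intersection of the lines a + t*da and b + s*db.
    det = da[0] * db[1] - da[1] * db[0]
    t = ((b[0] - a[0]) * db[1] - (b[1] - a[1]) * db[0]) / det
    return (a[0] + t * da[0], a[1] + t * da[1])

def inset_polygon(poly, d):
    """One roof-feature iteration: every edge of a convex CCW
    polygon is shifted toward the centre by `d`, and adjacent
    shifted edges are intersected to form the next polygon."""
    n = len(poly)
    edges = [_offset_edge(poly[i], poly[(i + 1) % n], d) for i in range(n)]
    return [_intersect(*edges[i - 1], *edges[i]) for i in range(n)]

# A 4 x 3 rectangular footprint, shifted inward by 1:
inner = inset_polygon([(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)], 1.0)
# inner polygon ≈ (1,1)-(3,1)-(3,2)-(1,2)
```

Insetting `inner` again by 0.5 collapses all four vertices onto the segment from (1.5, 1.5) to (2.5, 1.5), i.e. the ridge line of a hip roof, matching the termination condition above in which polygons degenerate to lines or points.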
  • the processing resource may be adapted to remove at least one and optionally all polygons and/or bisectors, which may be removed dependent on roof type.
  • the data store may comprise metadata related to the footprint.
  • the processing resource may be adapted to determine metadata related to the footprint.
  • the processing resource may be adapted to determine metadata related to the footprint based on a shape and/or scale of the footprint, which may comprise using typical metadata associated with a shape and/or scale of the footprint.
  • the processing resource may be arranged to determine footprint metadata associated with buildings within a neighbourhood or predetermined distance of the building whose model is to be determined, and to determine metadata associated with the footprint based on the metadata of the neighbouring buildings.
  • the metadata may comprise wall height and/or a roof height and/or texture information and/or a roof type.
  • the processing resource may be adapted to generate a 3D model using the footprint and metadata related to the footprint.
  • the roof height metadata may comprise a total roof height.
  • the processing resource may be adapted to associate lines and/or points forming the roof features and portions of the original footprint with heights, based on the total roof height and the iteration of the roof features determination process in which the line or point was generated.
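The height association might be sketched as follows, under the assumption (not stated in the patent) that heights are spaced linearly from the footprint up to the total roof height, one level per iteration:

```python
def feature_heights(n_iterations, total_roof_height):
    """Assign a height to the footprint (level 0) and to the roof
    features generated in each subsequent iteration, spaced evenly
    up to the total roof height. Linear spacing is an assumption."""
    return [total_roof_height * i / n_iterations
            for i in range(n_iterations + 1)]

# Two inset iterations under a 3 m roof: eaves, mid-level, ridge.
levels = feature_heights(2, 3.0)
```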
  • the processing resource may be adapted to generate a roof model using the generated roof feature and corresponding heights.
  • the processing resource may be adapted to generate walls of the model by extrapolating a footprint upwards by a distance corresponding to the wall height metadata.
  • the processing resource may be adapted to generate a building model by combining the roof model with the walls of the model.
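The wall-extrusion and assembly steps can be sketched as follows (the face representation, a list of 3D vertex tuples per face, is a hypothetical choice, not the patent's):

```python
def extrude_walls(footprint, wall_height):
    """Generate one rectangular wall face per footprint edge by
    extrapolating the 2D perimeter upward through the wall height.
    Faces are lists of (x, y, z) vertices; illustrative sketch."""
    walls = []
    n = len(footprint)
    for i in range(n):
        (x1, y1), (x2, y2) = footprint[i], footprint[(i + 1) % n]
        walls.append([(x1, y1, 0.0), (x2, y2, 0.0),
                      (x2, y2, wall_height), (x1, y1, wall_height)])
    return walls

def building_model(footprint, wall_height, roof_faces):
    """Combine the extruded walls with separately generated roof
    faces (e.g. from the bisector/inset procedure) into one mesh."""
    return extrude_walls(footprint, wall_height) + roof_faces

walls = extrude_walls([(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)], 2.5)
```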
  • the navigation and/or mapping apparatus may comprise a location determination unit operably coupled to the processing resource and adapted to determine a location.
  • the navigation and/or mapping apparatus may comprise a user input device.
  • the navigation and/or mapping apparatus may be adapted to receive an input location, which may be input via the user input device.
  • the processing resource may be adapted to retrieve or determine at least one footprint associated with a location as determined by the location determining unit and/or the input location.
  • the processing resource may be adapted to generate a field of view around a determined or input location using the footprint data associated with the location.
  • the field of view may comprise one or more 3D models.
  • a navigation and/or mapping system comprising: a navigation and/or mapping apparatus as set forth above in relation to the first aspect of the invention; wherein the data store is remotely located from the navigation and/or mapping apparatus and accessible via a communications network.
  • a navigation and/or mapping system comprising: a navigation and/or mapping apparatus as set forth above in relation to the first aspect of the invention and/or a navigation and/or mapping system as set forth above in relation to the second aspect of the invention ; wherein the display is remotely located from the navigation and/or mapping apparatus and/or system and is in communication with the processing resource via a communications network.
  • a method for generating an image comprising: obtaining at least one building footprint; determining a roof height; and generating a 3D roof model using the footprint and the roof height.
  • the method may comprise generating and displaying a 3D model of the building, the building model comprising the roof model.
  • the method may comprise determining vertices of the footprint.
  • the method may comprise determining bisectors for at least one, and preferably each, vertex of the footprint.
  • the method may comprise determining intersections between bisectors.
  • the method may comprise determining only the closest intersection to the associated vertex for at least one, and preferably each, bisector and/or to disregard the second and subsequent intersection on at least one, and preferably each, bisector.
  • the method may comprise determining the closest intersection or intersections to the outline or border.
  • the method may comprise determining the distance between the closest intersection or intersections and the outline or border of the footprint.
  • the method may comprise determining the closest intersection or intersections to the perimeter and calculating a shift amount corresponding to the distance between the closest intersection or intersections and the perimeter of the footprint.
  • the method may comprise generating lines forming roof features, the generated lines lying parallel to corresponding constituent lines of the footprint, and shifted toward the centre of the footprint by an amount corresponding to the shift distance from a corresponding constituent line of the footprint such that each point on each generated line lies a distance corresponding to the shift distance from the closest point on the footprint.
  • the roof features may comprise one or more polygons, lines or points.
  • the method may comprise repeating the above roof feature generation process using the polygon as a starting point rather than the footprint, i.e. determining vertices of the polygon, determining bisectors associated with at least one vertex of the polygon, determining intersections between the bisectors, determining a shift distance corresponding to a distance between the closest intersection and the polygon, and generating further roof features by generating further lines lying parallel to corresponding constituent lines of the polygon, and shifted toward the centre of the polygon by an amount corresponding to the shift distance from a corresponding constituent line of the polygon such that each point on each further generated line lies a distance corresponding to the shift distance from the closest point on the polygon.
  • the method may comprise iteratively repeating the roof feature generation process until no further polygons are generated, and all polygons have been degenerated to lines or points.
  • the method may comprise removing at least one and optionally all polygons and/or bisectors, which may be removed dependent on roof type.
  • the method may comprise determining a wall height and/or a roof height and/or a roof texture and/or roof type.
  • the method may comprise determining a wall height and/or a roof height and/or a roof texture and/or roof type based on scale and/or shape of the footprint.
  • the method may comprise determining a wall height and/or a roof height and/or a roof texture and/or a roof type associated with other buildings, which may be determined to be in the vicinity of the building, and determining the wall height and/or roof height and/or roof texture and/or roof type accordingly.
  • the method may comprise determining footprints from a plan or aerial view.
  • the method may comprise determining a location, which may comprise a user's current location or an input location, which may be a user input location.
  • the method may comprise retrieving and/or determining footprint data associated with the location.
  • the method may comprise generating a field of view around the determined or input location using the footprint data associated with the location.
  • the field of view may comprise one or more 3D models.
  • the method may be a method for operating an apparatus as set forth above in relation to the first aspect and/or a system as set forth in relation to the second aspect and/or a server as set forth in relation to the third aspect.
  • a fifth aspect of the present invention is a computer program product and/or an apparatus programmed with a computer program product, the computer program product being adapted to implement an apparatus as set forth above in relation to the first aspect of the invention and/or a system as set forth above in relation to the second and/or third aspects of the invention and/or a method as set forth above in relation to the fourth aspect of the invention.
  • a sixth aspect of the invention is a method of determining roof metadata, comprising determining a scale and/or shape of a footprint, providing a database of typical footprint scale and/or shape data, matching the scale and/or shape of the footprint with the typical scale and/or shape data and providing roof metadata associated with the matched scale and/or shape.
  • a seventh aspect of the invention is a method of determining roof metadata, comprising identifying at least one neighbouring building within a predefined distance of the roof, retrieving roof metadata associated with the neighbouring building and determining the roof metadata based on the roof metadata associated with the neighbouring building.
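The neighbour-based determination of the seventh aspect might be sketched as follows (the stored-building tuple layout, the majority-vote rule for roof type and the averaging rule for roof height are all assumptions for illustration):

```python
import math
from collections import Counter

def roof_metadata_from_neighbours(position, buildings, radius):
    """Infer roof metadata for a building at `position` from the
    recorded metadata of neighbours within `radius`. Hypothetical
    data layout: each building is (x, y, roof_type, roof_height)."""
    near = [b for b in buildings
            if math.hypot(b[0] - position[0], b[1] - position[1]) <= radius]
    if not near:
        return None
    # Majority roof type and mean roof height among neighbours.
    roof_type = Counter(b[2] for b in near).most_common(1)[0][0]
    roof_height = sum(b[3] for b in near) / len(near)
    return {"roof_type": roof_type, "roof_height": roof_height}

buildings = [(0.0, 0.0, "gable", 4.0),
             (10.0, 0.0, "gable", 6.0),
             (200.0, 0.0, "flat", 2.0)]
meta = roof_metadata_from_neighbours((5.0, 0.0), buildings, 50.0)
```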
  • Figure 1 is a schematic illustration of an exemplary part of a Global Positioning System (GPS) usable by a navigation device;
  • Figure 2 is a schematic diagram of a communications system for communication between a navigation device and a server;
  • Figure 3 is a schematic illustration of electronic components of the navigation device of Figure 2 or any other suitable navigation device;
  • Figure 4 is a schematic diagram of an arrangement of mounting and/or docking a navigation device;
  • Figure 5 is a schematic representation of an architectural stack employed by the navigation device of Figure 3;
  • Figure 6 is a schematic illustration of entities supported by a processor of the navigation device of Figure 3;
  • Figures 7(a) to 7(c) are illustrations of examples of building formats representable by footprints;
  • Figure 8 is a schematic diagram of a footprint for use by the device of Figure 3;
  • Figure 9(a) is a schematic diagram of a footprint showing bisectors and intersections between bisectors generated from the footprint of Figure 8 by the device of Figure 3;
  • Figure 9(b) is a schematic diagram of a footprint showing polygons and lines generated from the footprint of Figure 8 by the device of Figure 3 during a first iteration;
  • Figure 9(c) is a schematic diagram of a footprint showing bisectors and lines generated from the footprint of Figure 8 by the device of Figure 3 during a second iteration;
  • Figures 10(a) to 10(d) show schematic diagrams of examples of roof types;
  • Figure 11 shows a diagram of a plan view image; and
  • Figure 12 shows a 3D model image generated using the device of Figure 3.
  • mapping apparatus as described herein could cover any kind of image generation system that can map data to image features, for example image features of individual buildings, as well as apparatus that can produce images representative of navigable maps.
  • a navigation device is intended to include (without limitation) any type of route planning, mapping and navigation device, irrespective of whether that device is embodied as a PN D, a vehicle such as an automobile, or indeed a computing resource, for example a portable personal computer (PC), a mobile telephone or a Personal Digital Assistant (PDA) executing viewing, mapping, route planning and/or navigation software.
  • the Global Positioning System (GPS) of Figure 1 and the like are used for a variety of purposes.
  • the GPS is a satellite-radio based navigation system capable of determining continuous position, velocity, time, and in some instances direction information for an unlimited number of users.
  • Known as NAVSTAR, the GPS incorporates a plurality of satellites which orbit the earth in extremely precise orbits. Based on these precise orbits, GPS satellites can relay their location to any number of receiving units.
  • the GPS system is implemented when a device, specially equipped to receive GPS data, begins scanning radio frequencies for GPS satellite signals. Upon receiving a radio signal from a GPS satellite, the device determines the precise location of that satellite via one of a plurality of different conventional methods. The device will continue scanning, in most instances, for signals until it has acquired at least three different satellite signals (noting that position is not normally, but can be, determined with only two signals using other triangulation techniques).
  • the receiver utilizes the three known positions to determine its own two-dimensional position relative to the satellites. This can be done in a known manner. Additionally, acquiring a fourth satellite signal allows the receiving device to calculate its three dimensional position by the same geometrical calculation in a known manner. The position and velocity data can be updated in real time on a continuous basis by an unlimited number of users.
  • the GPS system 100 comprises a plurality of satellites 102 orbiting about the earth 104.
  • a GPS receiver 106 receives spread spectrum GPS satellite data signals 108 from a number of the plurality of satellites 102.
  • the spread spectrum data signals 108 are continuously transmitted from each satellite 102, the spread spectrum data signals 108 transmitted each comprise a data stream including information identifying a particular satellite 102 from which the data stream originates.
  • the GPS receiver 106 generally requires spread spectrum data signals 108 from at least three satellites 102 in order to be able to calculate a two-dimensional position. Receipt of a fourth spread spectrum data signal enables the GPS receiver 106 to calculate, using a known technique, a three-dimensional position.
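The two-dimensional fix described above can be sketched numerically: with three known satellite (anchor) positions and three measured ranges, subtracting one range equation from the others yields a linear system for the receiver position. This is an illustrative planar sketch with exact ranges, not the patent's implementation; all names are assumptions:

```python
import math

def trilaterate_2d(anchors, ranges):
    """Estimate a 2D receiver position from three known anchor positions
    and measured distances to each. Subtracting the first circle equation
    from the other two linearises the problem into A @ [x, y] = b."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # non-zero when anchors are not collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Receiver actually at (3, 4); ranges measured to three anchors
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
ranges = [math.dist(a, (3.0, 4.0)) for a in anchors]
print(trilaterate_2d(anchors, ranges))  # → approximately (3.0, 4.0)
```

A fourth range would extend the same linear-system construction to a three-dimensional fix, as the text notes.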
  • a navigation device 200 comprising or coupled to the GPS receiver device 106, is capable of establishing a data session, if required, with network hardware of a "mobile" or telecommunications network via a mobile device (not shown), for example a mobile telephone, PDA, and/or any device with mobile telephone technology, in order to establish a digital connection, for example a digital connection via known Bluetooth technology.
  • the mobile device can establish a network connection (through the Internet for example) with a server 150.
  • a "mobile" network connection can be established between the navigation device 200 (which can be, and oftentimes is, mobile as it travels alone and/or in a vehicle) and the server 150 to provide a "real-time" or at least very "up to date" gateway for information.
  • the establishing of the network connection between the mobile device (via a service provider) and another device such as the server 150, using the Internet for example, can be done in a known manner.
  • any number of appropriate data communications protocols can be employed, for example the TCP/IP layered protocol.
  • the mobile device can utilize any number of communication standards such as CDMA2000, GSM, IEEE 802.11a/b/c/g/n, etc.
  • an internet connection may be utilised, achieved via a data connection through a mobile phone or mobile phone technology within the navigation device 200, for example.
  • the navigation device 200 may, of course, include its own mobile telephone technology within the navigation device 200 itself (including an antenna for example, or optionally using the internal antenna of the navigation device 200).
  • the mobile phone technology within the navigation device 200 can include internal components, and/or can include an insertable card (e.g. Subscriber Identity Module (SIM) card), complete with necessary mobile phone technology and/or an antenna for example.
  • mobile phone technology within the navigation device 200 can similarly establish a network connection between the navigation device 200 and the server 150, via the Internet for example, in a manner similar to that of any mobile device.
  • so that a Bluetooth enabled navigation device works correctly with the ever-changing spectrum of mobile phone models, manufacturers, etc., model/manufacturer specific settings may be stored on the navigation device 200 for example. The data stored for this information can be updated.
  • the navigation device 200 is depicted as being in communication with the server 150 via a generic communications channel 152 that can be implemented by any of a number of different arrangements.
  • the communication channel 152 generically represents the propagating medium or path that connects the navigation device 200 and the server 150.
  • the server 150 and the navigation device 200 can communicate when a connection via the communications channel 152 is established between the server 150 and the navigation device 200 (noting that such a connection can be a data connection via a mobile device, a direct connection via personal computer via the internet, etc.).
  • the communication channel 152 is not limited to a particular communication technology. Additionally, the communication channel 152 is not limited to a single communication technology; that is, the channel 152 may include several communication links that use a variety of technology. For example, the communication channel 152 can be adapted to provide a path for electrical, optical, and/or electromagnetic communications, etc. As such, the communication channel 152 includes, but is not limited to, one or a combination of the following: electric circuits, electrical conductors such as wires and coaxial cables, fibre optic cables, converters, radio-frequency (RF) waves, the atmosphere, free space, etc. Furthermore, the communication channel 152 can include intermediate devices such as routers, repeaters, buffers, transmitters, and receivers, for example.
  • the communication channel 152 includes telephone and computer networks. Furthermore, the communication channel 152 may be capable of accommodating wireless communication, for example, infrared communications, radio frequency communications, such as microwave frequency communications, etc. Additionally, the communication channel 152 can accommodate satellite communication.
  • the communication signals transmitted through the communication channel 152 include, but are not limited to, signals as may be required or desired for given communication technology.
  • the signals may be adapted to be used in cellular communication technology such as Time Division Multiple Access (TDMA), Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA), Global System for Mobile Communications (GSM), etc.
  • Both digital and analogue signals can be transmitted through the communication channel 152.
  • These signals may be modulated, encrypted and/or compressed signals as may be desirable for the communication technology.
  • the server 150 includes, in addition to other components which may not be illustrated, a processor 154 operatively connected to a memory 156 and further operatively connected, via a wired or wireless connection 158, to a mass data storage device 160.
  • the mass storage device 160 contains a store of navigation data and map information, and can again be a separate device from the server 150 or can be incorporated into the server 150.
  • the processor 154 is further operatively connected to transmitter 162 and receiver 164, to transmit and receive information to and from navigation device 200 via communications channel 152.
  • the signals sent and received may include data, communication, and/or other propagated signals.
  • the transmitter 162 and receiver 164 may be selected or designed according to the communications requirement and communication technology used in the communication design for the navigation system 200. Further, it should be noted that the functions of transmitter 162 and receiver 164 may be combined into a single transceiver.
  • the navigation device 200 can be arranged to communicate with the server 150 through communications channel 152, using transmitter 166 and receiver 168 to send and receive signals and/or data through the communications channel 152, noting that these devices can further be used to communicate with devices other than server 150.
  • the transmitter 166 and receiver 168 are selected or designed according to communication requirements and communication technology used in the communication design for the navigation device 200 and the functions of the transmitter 166 and receiver 168 may be combined into a single transceiver as described above in relation to Figure 2.
  • the navigation device 200 comprises other hardware and/or functional parts, which will be described later herein in further detail.
  • Software stored in server memory 156 provides instructions for the processor 154 and allows the server 150 to provide services to the navigation device 200.
  • One service provided by the server 150 involves processing requests from the navigation device 200 and transmitting navigation data from the mass data storage 160 to the navigation device 200.
  • Another service that can be provided by the server 150 includes processing the navigation data using various algorithms for a desired application and sending the results of these calculations to the navigation device 200.
  • the server 150 constitutes a remote source of data accessible by the navigation device 200 via a wireless channel.
  • the server 150 may include a network server located on a local area network (LAN), wide area network (WAN), virtual private network (VPN), etc.
  • the server 150 may include a personal computer such as a desktop or laptop computer, and the communication channel 152 may be a cable connected between the personal computer and the navigation device 200.
  • a personal computer may be connected between the navigation device 200 and the server 150 to establish an internet connection between the server 150 and the navigation device 200.
  • the navigation device 200 may be provided with information from the server 150.
  • the processor 154 in the server 150 may be used to handle the bulk of processing needs, however, a processor (not shown in Figure 2) of the navigation device 200 can also handle much processing and calculation, oftentimes independent of a connection to a server 150.
  • the block diagram of the navigation device 200 is not inclusive of all components of the navigation device, but is only representative of many example components.
  • the navigation device 200 is located within a housing (not shown).
  • the navigation device 200 includes a processing resource comprising, for example, the processor 202 mentioned above, the processor 202 being coupled to an input device 204 and a display device, for example a display screen 206.
  • the input device 204 represents any number of input devices, including a keyboard device, voice input device, touch panel and/or any other known input device utilised to input information.
  • the display screen 206 can include any type of display screen such as a Liquid Crystal Display (LCD), for example.
  • one aspect of the input device 204, the touch panel, and the display screen 206 are integrated so as to provide an integrated input and display device, including a touchpad or touchscreen input 250 (Figure 4) to enable both input of information (via direct input, menu selection, etc.) and display of information through the touch panel screen so that a user need only touch a portion of the display screen 206 to select one of a plurality of display choices or to activate one of a plurality of virtual or "soft" buttons.
  • the processor 202 supports a Graphical User Interface (GUI) that operates in conjunction with the touchscreen.
  • the processor 202 is operatively connected to and capable of receiving input information from input device 204 via a connection 210, and operatively connected to at least one of the display screen 206 and the output device 208, via respective output connections 212, to output information thereto.
  • the navigation device 200 may include an output device 208, for example an audible output device (e.g. a loudspeaker).
  • input device 204 can include a microphone and software for receiving input voice commands as well.
  • the navigation device 200 can also include any additional input device 204 and/or any additional output device, such as audio input/output devices for example.
  • The processor 202 is operatively connected to memory 214 via connection 216 and is further adapted to receive/send information from/to input/output (I/O) ports 218 via connection 220, wherein the I/O port 218 is connectible to an I/O device 222 external to the navigation device 200.
  • the external I/O device 222 may include, but is not limited to an external listening device, such as an earpiece for example.
  • connection to I/O device 222 can further be a wired or wireless connection to any other external device such as a car stereo unit for hands-free operation and/or for voice activated operation for example, for connection to an earpiece or headphones, and/or for connection to a mobile telephone for example, wherein the mobile telephone connection can be used to establish a data connection between the navigation device 200 and the Internet or any other network for example, and/or to establish a connection to a server via the Internet or some other network for example.
  • Figure 3 further illustrates an operative connection between the processor 202 and an antenna/receiver 224 via connection 226, wherein the antenna/receiver 224 can be a GPS antenna/receiver for example.
  • the antenna and receiver designated by reference numeral 224 are combined schematically for illustration, but that the antenna and receiver may be separately located components, and that the antenna may be a GPS patch antenna or helical antenna for example.
  • the electronic components shown in Figure 3 are powered by one or more power sources (not shown) in a conventional manner.
  • different configurations of the components shown in Figure 3 are contemplated.
  • the components shown in Figure 3 may be in communication with one another via wired and/or wireless connections and the like.
  • the navigation device 200 described herein can be a portable or handheld navigation device 200.
  • the portable or handheld navigation device 200 of Figure 3 can be connected or "docked" in a known manner to a vehicle such as a bicycle, a motorbike, a car or a boat for example. Such a navigation device 200 is then removable from the docked location for portable or handheld navigation use.
  • the navigation device 200 may be a unit that includes the integrated input and display device 206 and the other components of Figure 2 (including, but not limited to, the internal GPS receiver 224, the microprocessor 202, a power supply (not shown), memory systems 214, etc.).
  • the navigation device 200 may sit on an arm 252, which itself may be secured to a vehicle dashboard/window/etc. using a suction cup 254.
  • This arm 252 is one example of a docking station to which the navigation device 200 can be docked.
  • the navigation device 200 can be docked or otherwise connected to the arm 252 of the docking station by snap connecting the navigation device 200 to the arm 252 for example.
  • the navigation device 200 may then be rotatable on the arm 252.
  • a button (not shown) on the navigation device 200 may be pressed, for example.
  • Other equally suitable arrangements for coupling and decoupling the navigation device 200 to a docking station are well known to persons of ordinary skill in the art.
  • the processor 202 and memory 214 cooperate to support a BIOS (Basic Input/Output System) 282 that functions as an interface between functional hardware components 280 of the navigation device 200 and the software executed by the device.
  • the processor 202 then loads an operating system 284 from the memory 214, which provides an environment in which application software 286 (implementing some or all of the above described route planning and navigation functionality) can run.
  • the application software 286 provides an operational environment including the GUI that supports core functions of the navigation device, for example map viewing, route planning, navigation functions and any other functions associated therewith.
  • part of the application software 286 comprises a view generation module 288.
  • the view generation module 288 supported by the processor 202 comprises a map data processor 290 capable of communicating with a view generation engine 292.
  • the map data processor 290 is capable of accessing the memory 214 in order to access map data 293, the map data 293 comprising at least one footprint 294 and meta data including wall height data 296 and roof data 298, including roof type, height and texture meta data.
  • the view generation module 288 is operable to generate 3D models of buildings using the footprint 294 stored in the memory 214 and the metadata 296, 298.
  • the footprint 294 represents buildings having roofs of various forms, for example a building 300a having a four sided roof with a convex footprint, as shown in Figure 7(a), a building 300b having a four sided roof with a concave footprint, as shown in Figure 7(b), or a building 300c having a four sided roof with holes, for example representing a courtyard, as shown in Figure 7(c).
  • the footprint data comprises an image or geometrical data that generally comprises or defines a series of component lines forming a polygon that is representative of an external perimeter, border or outline of a structure such as a building 300 when viewed from above, i.e. in plan view.
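The stored data described above might be modelled as follows. The field names and the roof-type encoding are illustrative assumptions; the text only specifies a 2D footprint polygon (294) together with wall height (296) and roof metadata (298):

```python
from dataclasses import dataclass

# A sketch of one stored building record; field names and the
# roof-type encoding are illustrative assumptions.
@dataclass
class BuildingRecord:
    footprint: list[tuple[float, float]]  # ordered 2D perimeter vertices (294)
    wall_height: float                    # wall height metadata (296)
    roof_type: int                        # e.g. 0 = flat, 1 = two-sided, 2 = four-sided (298)
    roof_height: float                    # maximum roof height above the walls (298)

record = BuildingRecord(
    footprint=[(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)],
    wall_height=6.0,
    roof_type=2,
    roof_height=3.0,
)
print(len(record.footprint))  # → 4
```

Storing only this record, rather than an explicit list of 3D vertices, is what yields the memory savings quantified later in the text.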
  • An example of a suitable footprint 294 is shown in Figure 8.
  • the view generation module 288 is adapted to divide each footprint into segments, by matching the shape of the footprint to simple pre-defined shapes, such as rectangles and squares.
  • the view generation module 288 is arranged to generate walls in 3 dimensions for each segment of the building 300 represented by the footprint 294 by generating a 3D object for each segment by extrapolating the perimeter of the footprint perpendicularly by a distance corresponding to the wall height obtained from the wall height data 296 stored in the memory 214.
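A minimal sketch of the wall-generation step described above: each perimeter edge of the footprint is extruded perpendicularly to the ground plane by the wall height, giving one rectangular 3D face per edge. Function and variable names are illustrative:

```python
def extrude_walls(footprint, wall_height):
    """Turn a 2D footprint polygon (ordered list of (x, y) vertices)
    into one 3D quad per perimeter edge by extruding each edge
    vertically by `wall_height`."""
    walls = []
    n = len(footprint)
    for i in range(n):
        (x1, y1), (x2, y2) = footprint[i], footprint[(i + 1) % n]
        walls.append([
            (x1, y1, 0.0), (x2, y2, 0.0),                  # bottom edge
            (x2, y2, wall_height), (x1, y1, wall_height),  # top edge
        ])
    return walls

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(len(extrude_walls(square, 6.0)))  # → 4 (one quad per edge)
```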
  • the walls of the building 300 are determined to correspond to the line of each segment that forms at least a portion of the perimeter of the building footprint 294.
  • the view generation module 288 is operable to generate a roof model for each building 300 from the footprint data 294 and the roof meta data 298.
  • the view generation module 288 is operable to retrieve the roof type from the roof metadata 298 and perform a roof determination process based on the roof type.
  • the processor 202 is operable to determine the roof meta data 298, for example, by associating scales and/or shapes of the footprint with typical metadata for that shape and/or scale and/or by determining roof type data for the roofs of other buildings in the vicinity of the building 300 for which the model is being determined.
  • the view generation module 288 is operable to identify a roof type from the roof meta data 298 and to generate a roof model in a manner that is dependent on the roof type.
  • the operation of the view generation module 288 to generate a 4 sided roof for a concave footprint is described below. However, it will be appreciated that corresponding processes may be used to generate other roof types such as two sided roofs, pyramidal roofs and the like.
  • the map data processor 290 is operable to retrieve footprint data 294 and any roof meta data 296, 298 associated with a particular building from the memory 214 and communicate this to the view generation module 288.
  • the view generation module 288 processes the footprint 294 to identify any vertices 302 in the footprint, i.e. points associated with corners in the building 300 where the perimeter of the building 300 as indicated by the footprint 294 changes direction, as shown in Figure 9(a). For each vertex 302 of the footprint 294, the view generation module 288 calculates an associated bisector 304. Each bisector 304 takes the form of a line passing through the associated vertex 302 and extending internally of the footprint 294, wherein the angles between the bisector 304 and each of the sections of the footprint 294 that form the vertex 302 are equal.
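The per-vertex bisector described above can be computed by summing the unit vectors of the two edges meeting at the vertex; for a convex corner of a counter-clockwise footprint the result points into the polygon. A sketch under those assumptions (names are illustrative):

```python
import math

def interior_bisector(prev_pt, vertex, next_pt):
    """Unit direction of the bisector at `vertex`, obtained by summing
    the unit vectors along the two edges that meet there. For a convex
    corner of a counter-clockwise footprint this points into the
    polygon; a reflex (concave) corner would need the opposite sign."""
    ax, ay = prev_pt[0] - vertex[0], prev_pt[1] - vertex[1]
    bx, by = next_pt[0] - vertex[0], next_pt[1] - vertex[1]
    la, lb = math.hypot(ax, ay), math.hypot(bx, by)
    dx, dy = ax / la + bx / lb, ay / la + by / lb
    norm = math.hypot(dx, dy)
    return dx / norm, dy / norm

# Right-angle corner at the origin of a counter-clockwise square:
# the bisector runs diagonally into the footprint.
print(interior_bisector((0, 5), (0, 0), (5, 0)))  # → (≈0.707, ≈0.707)
```

By construction the returned direction makes equal angles with the two footprint sections forming the vertex, matching the definition in the text.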
  • the view generation module 288 is operable to determine any intersections 306 between bisectors 304. In this regard, for each bisector 304, only the closest intersection 306 to the vertex 302 associated with that bisector 304 is considered or determined. Furthermore, any intersections 306 lying outwith the footprint 294 are not considered or determined.
  • the view generation module 288 is then operable to determine the closest intersection 306a to the perimeter of the footprint 294 and calculate a shift amount 307 corresponding to the smallest distance between the closest intersection 306a and the perimeter of the footprint 294.
  • the view generation module 288 is operable to generate a series of lines 308 corresponding to component lines that form the footprint, as shown in Figure 9(b).
  • the generated lines 308 are generated by forming lines parallel to component lines 310 making up the footprint 294 and located internally of the footprint 294 by a distance corresponding to the shift amount 307.
  • the generated lines 308 are sized such that the generated lines 308 approach no closer than the shift amount 307 to the footprint 294. It will be appreciated that, depending on the geometry of the footprint 294, these generated lines may form polygons 312, lines 314 or points.
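The inward-shifted component lines above can be sketched as parallel translations of each edge along its inward normal; for a counter-clockwise footprint the interior lies to the left of each directed edge. An illustrative helper (the subsequent trimming of the shifted lines to the shift amount, as the text requires, is omitted):

```python
import math

def offset_edge_inward(p1, p2, shift):
    """Translate the edge p1→p2 by `shift` along its inward normal.
    For a counter-clockwise footprint the interior lies to the left of
    each directed edge, so the left-hand normal points inward."""
    ex, ey = p2[0] - p1[0], p2[1] - p1[1]
    length = math.hypot(ex, ey)
    nx, ny = -ey / length, ex / length  # left-hand (inward) normal
    return ((p1[0] + nx * shift, p1[1] + ny * shift),
            (p2[0] + nx * shift, p2[1] + ny * shift))

# The bottom edge of a counter-clockwise square moves up, into the polygon
print(offset_edge_inward((0, 0), (10, 0), 1.0))  # → ((0.0, 1.0), (10.0, 1.0))
```

Applying this to every component line with the shift amount 307, and intersecting the results, is what produces the polygons 312, lines 314 or points described in the text.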
  • the view generation module 288 is operable to repeat the above processes with respect to each polygon 312, i.e. locating the vertices 312a of the polygon 312, calculating the bisectors 312b associated with each of the polygon's vertices 312a, calculating the intersections 312c of bisectors 312b and generating further polygons, lines 316 or points as appropriate based on the minimum distance between the polygon 312 and the closest intersection 312c, as shown in Figure 9(c). This process is repeated in an iterative fashion until no new polygons 312 are produced, and all the polygons 312 have been degenerated into lines 314, 316 or points.
  • any polygons 312 that were generated are removed, leaving only the generated lines 314, 316, points (i.e. corresponding to roof apexes) and bisectors 304, 312b leading from an associated vertex 302 of the original footprint 294 to the first line 314, 316 or point it meets, which results in a final roof plan, as shown in Figure 9(d).
  • the relative height of each node, i.e. line 314, 316 or point, generated at each iteration is determined, and may be assigned an associated height based on a fraction of the maximum height retrieved from the roof meta data 298. For example, the highest nodes 316 (i.e. those generated in the final iteration) are assigned a relative height of 1 (corresponding to the height of the walls plus the maximum roof height) and the height at the vertices 302 of the original footprint 294 is assigned a relative height of zero (corresponding to the wall height). Nodes 314 generated in intervening iterations are assigned a relative height between 0 and 1 depending on the iteration in which they were generated.
  • the relative height of each point on a bisector 304 is determined by interpolating between the relative heights at each end of the bisector 304.
  • the relative heights are used by the view generation module 288, along with roof height metadata 298, to scale the roof and determine the angle of the roof.
  • the total height of the roof can be determined based on the relative heights and type of roof (which can contain angle information).
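The height assignment above can be sketched as follows, assuming the relative heights are spaced linearly by iteration (an assumption; the text only states values between 0 and 1 depending on the iteration):

```python
def node_height(iteration, total_iterations, wall_height, roof_height):
    """Absolute height of a roof node generated at a given shrinking
    iteration: footprint vertices (iteration 0) sit at wall height,
    final-iteration nodes at wall height plus the maximum roof height,
    with intermediate iterations spaced linearly between (an assumed
    linear spacing)."""
    relative = iteration / total_iterations
    return wall_height + relative * roof_height

print(node_height(0, 2, 6.0, 3.0))  # → 6.0 (eaves, at the wall height)
print(node_height(2, 2, 6.0, 3.0))  # → 9.0 (ridge/apex)
```

Heights along each bisector then follow by linear interpolation between the heights at its two end nodes, as the text describes.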
  • the result of the roof data generation process performed by the view generation module 288 is to provide a series of segments, along with associated roof heights, angles and outline edges that may be used in conjunction with simple 3D objects representing the walls, in order to generate 3D models 320 of buildings 300, such as that shown in Figure 12.
  • this process may be modified for roof configurations other than a 4 sided concave roof.
  • bisectors corresponding to vertices of the building 300 having acute internal angles may be removed and any ends of generated lines 314', 316' that were located at a removed bisector may be extended to the perimeter of the footprint 294.
  • the bisectors 304" all meet at a single point 318 rather than a line 314, 316 or polygon 312.
  • the segmentation process used above to generate walls may also be used in relation to roofs.
  • the building 300 shown in Figures 10(c) and 10(d) has been segmented into two rectangular segments 319a, 319b, and each segment 319a, 319b treated as an independent footprint 294 to generate a four sided and two sided roof respectively.
  • the view generation module 288 is operable to use one or more 3D building models 320 generated to create a field of view image, such as a field of view corresponding to the current location of the device, as determined using the GPS system 224, 226, or corresponding to a location input by a user using the input device 204.
  • the 3D models 320 can be rotated, scaled, moved, and/or modified to account for perspective in accordance with their position relative to the determined or input location in order to create a field of view image.
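A minimal stand-in for the rotate/scale/translate step above (a full perspective projection is omitted); the parameterisation by view heading, scale and offset is an illustrative assumption:

```python
import math

def transform_model(vertices, heading_deg, scale, offset):
    """Rotate model vertices about the vertical axis by the view
    heading, scale them, and translate in the ground plane by the
    model's position relative to the viewer."""
    c = math.cos(math.radians(heading_deg))
    s = math.sin(math.radians(heading_deg))
    out = []
    for x, y, z in vertices:
        rx, ry = x * c - y * s, x * s + y * c  # rotate about vertical axis
        out.append((rx * scale + offset[0], ry * scale + offset[1], z * scale))
    return out

print(transform_model([(1.0, 0.0, 2.0)], 90.0, 2.0, (5.0, 5.0)))
# → [(≈5.0, ≈7.0, 4.0)]
```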
  • the device 200 may be arranged to store aerial or plan view images 350, such as aerial or satellite photographs or town planning records, and to process the aerial or plan view images 350, for example as shown in Figure 11, in order to determine building footprints 294, using techniques such as edge recognition and pattern matching.
  • the determined footprints 294 may be used in the same manner as stored footprints 294, as described above.
  • Table 1: A comparison of the data requirements for defining a 3D building model using definitions of each vertex location vs. using the footprint method, for the model shown in Figure 9(d)
  • the walls of the 3D model have 16 vertices, each vertex requiring three coordinates to define its 3D position, which requires enough memory to store 48 double precision floating point numbers.
  • the roof in the 3D model has 12 vertices, each vertex requiring three coordinates to define its 3D location. This requires enough memory to store 36 double precision floating point numbers. Therefore, to store a 3D model of the building shown in Figure 9(d) in terms of the location of each of its vertices requires enough memory to store 84 floating point numbers.
  • the complexity is also greatly increased, as 8 polygons (surfaces) are required to form the walls and 7 polygons (surfaces) are required to form the roof.
  • Information relating to each polygon, such as which nodes are associated with which polygon, may also be stored. In this case, the memory required to store 3D models would also be increased.
  • the device 200 only needs to process 8 vertices defined in a 2D coordinate space in order to define the walls of the same building 300 plus an additional variable that defines the wall height. This only requires enough memory to store 17 double precision floating point numbers. Furthermore, only the two dimensional coordinates of 8 vertices 302, the height of the roof and the type of roof are required in order to generate a 3D roof image, which requires 17 double precision floating point numbers and 1 integer number (representing the roof type). Therefore, only sufficient memory and bandwidth to store and transmit 34 floating point numbers and 1 integer number is required in order to display the 3D model 320 shown in Figure 9(d).
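The arithmetic above can be checked directly; the helper names are illustrative:

```python
def vertex_model_floats(wall_vertices, roof_vertices):
    """Explicit 3D model: three coordinates per vertex."""
    return 3 * wall_vertices + 3 * roof_vertices

def footprint_model_storage(footprint_vertices):
    """Footprint method: 2D coordinates plus wall height for the walls,
    the same 2D coordinates plus roof height for the roof, and one
    integer for the roof type. Returns (floats, integers)."""
    walls = 2 * footprint_vertices + 1   # 2D vertices + wall height
    roof = 2 * footprint_vertices + 1    # 2D vertices + roof height
    return walls + roof, 1               # + 1 integer for the roof type

print(vertex_model_floats(16, 12))   # → 84
print(footprint_model_storage(8))    # → (34, 1)
```

This reproduces the comparison in the text: 84 floats for the explicit vertex model versus 34 floats and 1 integer for the footprint method.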
  • devices such as that described above that store and/or transmit 2D footprint data and associated metadata and are adapted to subsequently convert this into 3D model data, require considerably less memory and bandwidth to provide 3D model images than devices that directly store and transmit 3D model data.
  • the application may be used in relation to a display or mapping system running on a personal computer, laptop, PDA, mobile phone or other device with computational functionality, for example, akin to systems that provide applications such as Google (RTM) maps, Bing (RTM) maps, OVI (RTM) maps or the like.
  • the actual device performing the conversion from a 2D footprint image 294 and associated meta data 296, 298 to a 3D model 320 and any communications links required may vary.
  • the processing may optionally be performed on a portable device using 2D footprint and associated meta data stored in a memory 214 of the device.
  • the memory space on such devices is often limited and storing the source data in the form of footprint images and associated meta data and converting the footprint images into 3D model images using the metadata on the device may increase the amount of building models that can be stored on such devices.
  • the image conversion/generation may be performed on a computational device 200, such as those described above, and the footprint and associated metadata may be stored at a remote server 150 or database.
  • communications bandwidth between the server/remote database and the computational device may be reduced, as the data transmitted is reduced and any storage of model data on the computational device may also be reduced.
  • both the image data and processing may be performed at a remote location, such as a server 150, and transmitted to the device 200.
  • This arrangement would result in more efficient use of storage 156 at the server 150 than storing the 3D model images directly.
  • the navigation device may utilise any kind of position sensing technology as an alternative to (or indeed in addition to) GPS.
  • the navigation device may utilise other global navigation satellite systems, such as the European Galileo system. Equally, it is not limited to satellite-based systems, but could readily function using ground-based beacons or any other kind of system that enables the device to determine its geographic location.
  • Alternative embodiments of the invention can be implemented as a computer program product for use with a computer system, the computer program product being, for example, a series of computer instructions stored on a tangible data recording medium, such as a diskette, CD-ROM, ROM, or fixed disk, or embodied in a computer data signal, the signal being transmitted over a tangible medium or a wireless medium, for example, microwave or infrared.
  • the series of computer instructions can constitute all or part of the functionality described above, and can also be stored in any memory device, volatile or non-volatile, such as semiconductor, magnetic, optical or other memory device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Navigation (AREA)

Abstract

The invention relates to a navigation and/or mapping apparatus comprising: a processing resource (202) operably coupled to a data store (214, 293), the data store (214, 293) being arranged to store at least one footprint source, and the processing resource (202) being arranged to obtain a footprint (294) from the footprint source, generate a model (320) of a structure based on the footprint (294), and provide image data relating to the model (320).
PCT/EP2010/051493 2010-02-08 2010-02-08 Apparatus and method for generating a view WO2011095226A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2010/051493 WO2011095226A1 (fr) 2010-02-08 2010-02-08 Apparatus and method for generating a view

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2010/051493 WO2011095226A1 (fr) 2010-02-08 2010-02-08 Apparatus and method for generating a view

Publications (1)

Publication Number Publication Date
WO2011095226A1 true WO2011095226A1 (fr) 2011-08-11

Family

ID=42990223

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2010/051493 WO2011095226A1 (fr) 2010-02-08 2010-02-08 Apparatus and method for generating a view

Country Status (1)

Country Link
WO (1) WO2011095226A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1189176A2 (fr) * 2000-09-19 2002-03-20 Alpine Electronics, Inc. Méthode d'affichage de carte
FR2901382A1 (fr) * 2006-05-16 2007-11-23 Sagem Defense Securite Procede de transmission d'un flux de donnees cartographiques a un utilisateur et equipements associes

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Brenner et al.: "Towards fully automated 3D city model generation", in Automatic Extraction of Man-Made Objects from Aerial and Space Images III, 1 January 2001 (2001-01-01), pages 1-10, XP007915555 *
Laycock R G et al.: "Automatically generating large urban environments based on the footprint data of buildings", Proceedings 8th ACM Symposium on Solid Modeling and Applications (SM'03), Seattle, WA, June 16-20, 2003, New York, NY: ACM, 16 June 2003 (2003-06-16), pages 346-351, XP001233234, ISBN: 978-1-58113-706-4, DOI: 10.1145/781606.781663 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130324098A1 (en) * 2012-06-05 2013-12-05 Patrick S. Piemonte Methods and Apparatus for Determining Environmental Factors to Modify Hardware or System Operation
US9418478B2 (en) 2012-06-05 2016-08-16 Apple Inc. Methods and apparatus for building a three-dimensional model from multiple data sets
US9582932B2 (en) * 2012-06-05 2017-02-28 Apple Inc. Identifying and parameterizing roof types in map data
US9756172B2 (en) * 2012-06-05 2017-09-05 Apple Inc. Methods and apparatus for determining environmental factors to modify hardware or system operation
US10163260B2 (en) 2012-06-05 2018-12-25 Apple, Inc. Methods and apparatus for building a three-dimensional model from multiple data sets
US12002161B2 (en) 2012-06-05 2024-06-04 Apple Inc. Methods and apparatus for building a three-dimensional model from multiple data sets
WO2014161605A1 (fr) * 2013-04-05 2014-10-09 Harman Becker Automotive Systems Gmbh Dispositif de navigation, procédé de fourniture d'une carte électronique et procédé de génération d'une base de données

Similar Documents

Publication Publication Date Title
US9739633B2 (en) Navigation device and method
US9638539B2 (en) Navigation methods and apparatus
US8706403B2 (en) Systems and methods for detecting bifurcations
EP2531816B1 (fr) Map storage for navigation systems
US20080228393A1 (en) Navigation device and method
US20110087715A1 (en) Method and apparatus for preparing map data
WO2009156426A1 (fr) Navigation device and method
WO2010040400A1 (fr) Navigation apparatus and method for obtaining points of interest
WO2010040386A1 (fr) Navigation apparatus and method of determining a route therefor
WO2011095226A1 (fr) Apparatus and method for generating a view
WO2011154050A1 (fr) Navigation device and method with improved instructions including a panoramic image of a scene
GB2492379A (en) Scaling a map displayed on a navigation apparatus in order to maximise the display of a remaining route
WO2010081538A2 (fr) Navigation device and method
WO2009132679A1 (fr) Navigation device and method
WO2010040382A1 (fr) Navigation apparatus and method of operation thereof
TW201131510A (en) Apparatus and method for generating a view
WO2010075876A1 (fr) Navigation system with means for indicating the lateral position

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10704919

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC DATED 08.11.2012

122 Ep: pct application non-entry in european phase

Ref document number: 10704919

Country of ref document: EP

Kind code of ref document: A1