WO2012089258A1 - Mapping methods and associated apparatus - Google Patents

Mapping methods and associated apparatus

Info

Publication number
WO2012089258A1
Authority
WO
WIPO (PCT)
Prior art keywords
location
features
measured
view
transformed image
Prior art date
Application number
PCT/EP2010/070892
Other languages
English (en)
Inventor
Radoslaw Chmielewski
Original Assignee
Tele Atlas Polska Sp.Z.O.O
Priority date
Filing date
Publication date
Application filed by Tele Atlas Polska Sp.Z.O.O
Priority to PCT/EP2010/070892
Publication of WO2012089258A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/12 Panospheric to cylindrical image transformations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3807 Creation or updating of map data characterised by the type of data
    • G01C21/3815 Road data
    • G01C21/3819 Road shape data, e.g. outline of a route
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3833 Creation or updating of map data characterised by the source of data
    • G01C21/3837 Data obtained from a single source
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3863 Structures of map data
    • G01C21/3867 Geometry of map features, e.g. shape points, polygons or for simplified maps
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

Definitions

  • the invention relates to methods for mapping, and associated apparatus and methods.
  • the invention relates to methods for measuring features at a location for use in mapping, such as digital mapping.
  • Maps that are provided digitally, or so-called digital maps, can be used remotely, such as over a network (e.g. the Internet), or locally, such as in a portable device (e.g. a portable multimedia or navigation device).
  • location-based signals such as Global Navigation Satellite Signals (e.g. Global Positioning System signals) and a vehicle fitted with a receiver configured to receive such location-based signals
  • a map such as a digital map.
  • digital maps can provide artificial representations of particular locations. This is true when a user is shown a 3D representation, or perspective view, of a particular location. Such views can be used in navigation devices, where the direction of travel is shown to a user in a 3D perspective.
  • the more accurately that a location is mapped, or represented by a digital map, the easier it is for a user to identify a real-world location when using that digital map.
  • the more effectively and accurately that locations are mapped, the more likely it is that users derive benefits from the information provided in a map without, for example, the need to measure or determine features at that location themselves.
  • According to a first aspect of the invention there is provided a method for measuring features at a location.
  • Such measured features may be for use in mapping.
  • the method may comprise measuring features of a location from a transformed image.
  • the transformed image may have been obtained from one or more images of the location.
  • the transformed image may have been transformed to provide a virtual field of view showing features to be measured at the location.
  • the transformed image may be associated with a full or partial panoramic image of a location, obtained from the one or more images of that location.
  • the transformed image may have been provided by combining the one or more images of a location.
  • the one or more images may have been captured by an imaging device.
  • the one or more images acquired at a location may be associated with one or more cameras.
  • the one or more images may have been acquired at, or around, the same time.
  • the images may have been taken using a vehicle travelling at the locations.
  • the transformed image may be associated with a virtual view point.
  • the virtual view point may have a principal direction of view.
  • the principal direction of view may be towards a feature, such as a surface, to be measured.
  • the principal direction may be normal, or roughly normal, to a feature to be measured.
  • the principal direction may be orthogonal to a feature to be measured.
  • the virtual field of view in the transformed image may be between roughly 130 degrees and 175 degrees.
  • the virtual field of view may be 170 degrees.
  • the virtual field of view may be between roughly 65 and 87.5 degrees, either side of the principal direction of view.
  • the virtual field of view may be roughly 85 degrees either side of the principal direction of view.
  • the measured features may be associated with a surface, such as a road or pavement/sidewalk surface.
  • the measured features may be on a surface, such as a road surface.
  • the method may comprise determining distances associated with particular features.
  • the measured features may include the width and/or length of roads, the size of parking bays, or the like.
  • the method may comprise using a reference feature in order to measure features in the transformed image.
  • the reference feature may be in the transformed image.
  • the reference feature may allow for associating a distance with a pixel in the transformed image.
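The bullet above can be sketched concretely: a reference feature of known real-world size yields a metres-per-pixel scale for the transformed image. This is an illustrative sketch, not taken from the patent; the numbers (a 3 m lane marking spanning 120 pixels) are hypothetical:

```python
def pixel_scale(ref_length_m: float, ref_length_px: float) -> float:
    """Metres represented by one pixel, derived from a reference feature
    of known real-world size visible in the transformed image."""
    return ref_length_m / ref_length_px

# Hypothetical values: a 3 m lane marking spans 120 px in the transformed image.
scale = pixel_scale(3.0, 120)   # 0.025 m per pixel
road_width_m = 280 * scale      # a road spanning 280 px measures 7.0 m
```

A single uniform scale like this is only reasonable when the principal direction of view is orthogonal to the measured surface, which is what the bullets on the principal direction of view describe.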
  • the method may comprise transforming a distance in the transformed image into 3D space, and measuring the feature.
  • the method may comprise measuring features in a plurality of transformed images. Each transformed image may be associated with a different location (e.g. a different location along a road).
  • the method may comprise storing measured features for use, such as subsequent use, with a database.
  • the database may comprise one or more digital maps.
  • the method may comprise using the measured features to provide, or supplement, a digital map.
  • the method may comprise storing additional location-based data for use with the measured features.
  • the method may comprise transforming one or more images to provide a transformed image having a virtual field of view showing features to be measured at a location.
  • a mapping database comprising measured features of one or more locations, the measured features of the location(s) having been measured from a transformed image obtained from one or more images of the location.
  • the transformed image may have been transformed to provide a virtual field of view showing the measured features at the location (e.g. between 130 and 175 degrees, such as 170 degrees).
  • the transformed image may have been transformed so as to have a principal direction of view towards a feature to be measured (e.g. a feature on or associated with a surface).
  • the principal direction may be orthogonal to a feature to be measured.
  • a digital map stored, or storable, on a computer readable medium or device comprising measured features of one or more locations, the measured features of the location(s) having been measured from a transformed image obtained from one or more images of the location.
  • the transformed image may have been transformed to provide a virtual field of view showing the measured features at the location (e.g. between 130 and 175 degrees, such as 170 degrees).
  • the transformed image may have been transformed so as to have a principal direction of view towards a feature to be measured (e.g. a feature on or associated with a surface).
  • the principal direction may be orthogonal to a feature to be measured.
  • a device, such as a portable media and/or navigation device, comprising and/or configured for use with a digital map according to the third aspect.
  • a method for measuring features at a location comprising:
  • the transformed image having been obtained from one or more images of the location and having been transformed to provide a principal direction of view towards a feature to be measured.
  • a method of providing for measuring features of a location for use in mapping comprising:
  • the method may comprise measuring one or more features in the transformed image.
  • transforming one or more images of a location to provide a transformed image having a principal direction of view towards a feature, such as roads, to be measured at the location.
  • the method may comprise measuring one or more features in the transformed image.
  • a database or computer readable medium, having the one or more transformed images according to the eighth aspect.
  • a computer program product comprising a computer program, configured to perform the method of any of the features of the above method aspects.
  • there is provided a method for measuring a means for characterising (e.g. surfaces) a location.
  • Such means for characterising a location may be for use in means for mapping (e.g. digital maps).
  • the method may comprise measuring means for characterising a location from a means for viewing a location (e.g. an image of a location).
  • the means for viewing may have been obtained from one or more images of the location.
  • the means for viewing may have been transformed to provide a virtual field of view of means for characterising a location to be measured.
  • the means for viewing may have been transformed to provide a principal direction of view towards a means for characterising a location.
  • apparatus for measuring features at a location. Such measured features may be for use in mapping.
  • the apparatus may be configured to measure features of a location using a transformed image.
  • the apparatus may be configured to use a transformed image having been obtained from one or more images of a location.
  • the apparatus may be configured to use a transformed image having been transformed to provide a virtual field of view showing features to be measured at the location.
  • the apparatus may be configured to use a transformed image having a principal direction of view towards a feature, such as roads, to be measured at the location.
  • the apparatus may be configured to transform one or more images to a transformed image.
  • means for measuring means for characterising a location. Such measured means for characterising a location may be useful with means for mapping.
  • the means for measuring may be configured to measure means for characterising a location using a means for viewing a location.
  • the means for measuring may be configured to use a means for viewing a location having been obtained from one or more images of a location.
  • the means for measuring may be configured to use a means for viewing a location having been transformed to provide a virtual field of view showing means for characterising a location to be measured at the location.
  • the means for measuring may be configured to use a means for viewing a location having a principal direction of view towards a means for characterising a location, such as roads, to be measured at the location.
  • the means for measuring may be configured to transform one or more images to a means for viewing a location.
  • the present invention includes one or more corresponding aspects, embodiments or features in isolation or in various combinations whether or not specifically stated (including claimed) in that combination or in isolation.
  • any optional features recited in respect of the first aspect may be equally applicable in relation to the second, third, fourth aspects, etc., without the need to list all the various possibilities or permutations here.
  • one or more embodiments/aspects may be useful in mapping features at locations and/or providing digital maps.
  • Figure 1a shows an exemplary representation of a portion of a digital map, and Figure 1b shows an alternative exemplary representation of the digital map, presented in a perspective view;
  • Figure 2 shows an example of a vehicle 50 and imaging device used for mapping;
  • Figures 3a and 3b show images from the imaging device of Figure 2, and Figure 3c shows an exemplary panoramic image of a location;
  • Figure 4a shows a sphere in 3D space for use with the images of Figure 3;
  • Figure 4b shows a transformed image for measuring features;
  • Figure 4c shows an exemplary application window for displaying a transformed image;
  • Figure 5 shows an exemplary manner of measuring features in the transformed image of Figure 4b.
  • Figures 6a and 6b show examples of further features to be measured using transformed images;
  • Figures 7a, 7b and 7c show corresponding transformed images having different virtual fields of view;
  • Figure 8a shows an exemplary field of view from a camera associated with the imaging device and Figure 8b shows the effective field of view of that camera when using a virtual field of view;
  • Figures 9a and 9b show further examples of transformed images from which features can be measured.
  • Figure 10 shows a flow diagram of an embodiment of a method for mapping.

Description of Specific Embodiments
  • Figure 1a shows an exemplary representation of a portion of a digital map 10, which may be shown to a user, for example during navigation (e.g. when using a portable navigation device).
  • a user arrow 20 indicates the location of the user as well as the direction of travel, or bearing, using location-based signals in a known manner.
  • the map 10 shows a series of segments 30 which, in this example, are representative of roads.
  • a navigation arrow 40 indicates to the user the next turning to make (e.g. based on a predetermined route).
  • Figure 1b shows an alternative exemplary representation of a portion of the digital map 10, in which the user is presented with a perspective view.
  • the view can be considered to be representative of a view that a user might see from the user's present location and direction of travel.
  • the view provided in Figure 1b is that which the user would see in front of them, when at the location/direction indicated by the user arrow 20 in Figure 1a.
  • a navigation arrow 45 is presented to the user.
  • a segment 35 is shown as a road extending out in front of the user, as they would observe looking forward from a vehicle.
  • views such as those provided in Figure 1b, which more realistically represent the view of a user, can be helpful, and less confusing to follow, during navigation. It can also be easier for a user to identify particular features on or associated with the road by looking at the digital map 10 in that perspective view. Additionally, such views allow the map 10 to present more information regarding a particular road, or the surroundings of a road. However, segments 30, 35 representative of roads are generally provided without much, if any, additional information. Therefore, for example, it might be helpful to be able to show accurate information about the width of the road, or the length of a parking bay, or crawler lane, or the exact manner in which two or more roads merge, etc. Such additional accurate information may help improve a user's experience of the digital map 10 as well as assist with navigation, for example, at night or during poor weather conditions.
  • FIG. 2 shows an example of a vehicle 50 used for mapping.
  • the vehicle 50 comprises location-based signal receivers (not shown), such as global navigation satellite system receivers (e.g. GPS, Galileo, GLONASS, or the like), cellular receivers, wireless network receivers, or the like.
  • the vehicle 50 additionally comprises an imaging device 60, configured to image a location.
  • the imaging device 60 comprises a plurality of cameras 65 configured to capture images of a location.
  • the cameras 65 are displaced around the imaging device 60 (e.g. circumferentially displaced). This allows for capturing images in different directions at particular locations.
  • An example of such an imaging device 60 is sold under the trade name LadyBug®2, which is provided by Point Grey, 12051 Riverside Way, Richmond, British Columbia, V6W 1K7, Canada.
  • the height, h, of imaging device 60 above the surface 100 can be determined, or approximated.
  • An exemplary height is 3 metres, or so.
  • the height, h may be different due to use of a different sized vehicle 50.
  • the imaging device 60 captures a plurality of images at particular intervals during travel.
  • Figures 3a and 3b show exemplary images 63, 67 taken from particular cameras 65 of the imaging device 60.
  • the plurality of images can be combined to show a full panoramic image 70 at that location, as is shown in Figure 3c.
  • the panoramic image 70 shown in Figure 3c has been provided from a plurality of images 75a-75e, each taken from a different camera 65.
  • the panoramic image 70 can be considered to be a full panoramic image, which shows a 360 degree representation of that particular location.
  • the panoramic image may be a partial panoramic image, showing less than 360 degrees (e.g. 270 degrees, 180 degrees, 90 degrees, or angles therebetween).
  • while the panoramic image 70 provides some information about the location, it is difficult to measure features at that location accurately using the panoramic image 70. For example, features of the surface 100, such as road or lane width, are difficult to determine from any of the images 75a-75e, or from the combined panoramic image 70.
  • however, it is possible to transform the panoramic image 70 to have a virtual field of view and/or principal direction of view showing one or more features, such as the road or lane width, such that those features can be easily measured in a transformed image.
  • a virtual field of view can be considered to be the view taken from a virtual camera, or virtual view point 105 (as will be described in relation to Figure 4a).
  • By creating a transformed image from the one or more images 75a-75e of a location (e.g. from the panoramic image 70), it has been discovered that it is possible to measure features, such as road or lane widths, at that location. The accurate measurement of such features is useful in digital maps, such as that shown in Figure 1b.
  • a virtual view point 105 can be created by using a perspective projection matrix and a view matrix, in a known manner.
  • An example of a perspective projection matrix in a 3D environment is described here: http://msdn.microsoft.com/en-us/library/bb147302%28VS.85%29.aspx
  • an example of a view matrix in a 3D environment is described here: http://msdn.microsoft.com/en-us/library/bb206342%28VS.85%29.aspx; however, any suitable perspective projection matrix or view matrix may be used.
  • a transformation matrix can then be provided by multiplying the view matrix and the projection matrix.
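As a sketch of how the view and projection matrices might be built and combined (a minimal NumPy reconstruction, not the Direct3D helpers linked above; the right-handed, column-vector conventions are an assumption), using parameters given elsewhere in the description: a 170-degree virtual field of view, a 698x588 window, near and far planes of roughly 0.1 m and 100 m, and a view point at the origin looking along (0, 0, -1) with up vector (0, 1, 0):

```python
import numpy as np

def perspective(fov_y_deg, aspect, near, far):
    """Right-handed perspective projection matrix (column-vector convention)."""
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    m = np.zeros((4, 4))
    m[0, 0] = f / aspect
    m[1, 1] = f
    m[2, 2] = far / (near - far)
    m[2, 3] = near * far / (near - far)
    m[3, 2] = -1.0
    return m

def look_at(eye, target, up):
    """Right-handed look-at view matrix."""
    fwd = target - eye
    fwd = fwd / np.linalg.norm(fwd)
    right = np.cross(fwd, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, fwd)
    m = np.eye(4)
    m[0, :3], m[1, :3], m[2, :3] = right, true_up, -fwd
    m[:3, 3] = -m[:3, :3] @ eye  # translate the eye to the origin
    return m

# Parameters from the description: 170-degree virtual field of view, 698x588
# window, near plane 0.1 m, far plane 100 m, view point looking straight down.
proj = perspective(170.0, 698 / 588, 0.1, 100.0)
view = look_at(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, -1.0]),
               np.array([0.0, 1.0, 0.0]))
transform = proj @ view  # combined transformation matrix
```

With the view point at the origin looking straight down, the view matrix reduces to the identity, so the combined matrix equals the projection matrix; the same construction applies unchanged for other poses.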
  • the panoramic image 70 can be projection mapped onto a sphere 150 so as to create, from a user's point of view, a transformed image 200 shown in Figure 4b, as will be described.
  • Figure 4a shows the sphere 150 created in 3D space. Cartesian co-ordinates (x, y, z) have been used below for explanation.
  • the panoramic image 70 can be projection mapped onto the sphere 150.
  • the height, h, of the imaging device 60 from the surface 100 (e.g. a road surface)
  • the virtual view point 105 is created at the centre of the sphere 150 (i.e. at 0, 0, 0).
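The projection mapping itself is not spelled out in the text; one common approach, assuming the panoramic image 70 is equirectangular, converts each direction from the sphere centre into longitude/latitude and then into panorama pixel coordinates:

```python
import math

def direction_to_panorama_uv(x, y, z, pano_w, pano_h):
    """Map a unit direction from the sphere centre to (u, v) pixel coordinates
    in an equirectangular panorama: longitude spans the width, latitude the height."""
    lon = math.atan2(x, -z)                     # -pi..pi, zero along (0, 0, -1)
    lat = math.asin(max(-1.0, min(1.0, y)))     # -pi/2..pi/2
    u = (lon / (2.0 * math.pi) + 0.5) * pano_w  # 0..pano_w across the image
    v = (0.5 - lat / math.pi) * pano_h          # 0 at the top (y = +1)
    return u, v

# The direction (0, 0, -1) lands at the centre of a hypothetical 4096x2048 panorama.
u, v = direction_to_panorama_uv(0.0, 0.0, -1.0, 4096, 2048)  # (2048.0, 1024.0)
```

The panorama size and the zero-longitude direction here are assumptions; any consistent convention would serve to texture the sphere 150 for rendering from the virtual view point 105.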
  • a virtual field of view of between roughly 130 degrees and 175 degrees is helpful in some examples, for example, when used to measure features of a surface such as a road (e.g. road widths or the like).
  • a virtual field of view of roughly 170 degrees is shown.
  • the principal direction of the virtual view point 105 is taken so as to be essentially normal, or orthogonal, to that surface 100 on which the features are to be measured. This is represented by the vector (0, 0, -1) shown in Figure 4a.
  • a virtual field of view of 170 degrees can therefore be considered to show, essentially, 85 degrees either side of the principal direction of virtual view point 105.
  • this could be considered to provide a virtual field of view of 360 degrees.
  • if the horizon between the sky and ground were exactly halfway up the panoramic image 70, then a virtual field of view of 180 degrees could show, in one example, exclusively sky, or, in another example, exclusively ground.
  • as the principal direction of the virtual view point 105 is towards the surface 100, only ground would be shown in that example.
  • the aspect ratio, or ratio of width to height in pixels, of the resultant transformed image 200 to be presented can be given.
  • Figure 4b shows a representation of a transformed image 200, which is taken from the virtual view point 105 in the direction towards features to be measured (e.g. features on the surface 100).
  • a perimeter 115 of the transformed image 200 is shown which corresponds to a perimeter 110 of the virtual field of view shown in Figure 4a.
  • An exemplary aspect ratio may be a width of 698 pixels and a height of 588 pixels, which would provide an aspect ratio of 1.18707483.
  • Figure 4c shows an exemplary application window having that aspect ratio for use in displaying transformed images 200.
  • a near field plane in the principal direction of view (i.e. the z-direction) and a far field plane in the principal direction can be provided.
  • a near field plane is established such that it is less than the radius of the sphere 150.
  • a far field plane is established such that it is greater than the radius of the sphere 150.
  • a near field plane may be roughly 0.1 m
  • a far field plane may be roughly 100 m.
  • the position vector, represented in co-ordinates x, y, z, of the virtual view point 105 can be considered to be (0, 0, 0), such that the view point is in the centre of the panoramic image 70.
  • the direction vector of the virtual view point can be considered to be (0, 0, -1), such that the virtual view point 105 is directed towards features to be measured (e.g. facing downwards).
  • a so-called up vector of the virtual point of view can be considered to be (0, 1, 0), so that when the virtual view point is facing down, the up vector lies in the xy-plane.
  • Figure 4b also shows the transformed image 200 having a virtual field of view showing one or more features to be measured.
  • the transformed image 200 has been provided from one or more images of a location, and shows a surface 310, which in this example is a road 310.
  • the road 310 has a left lane 310a and a right lane 310b. Portions of the vehicle 50 are also shown in the transformed image 200.
  • a region 350 in which no data exists is also shown, where no image was captured from the imaging device 60 (i.e. at the position of the imaging device 60 on the roof of the vehicle 50).
  • the virtual view point appears to a user to be looking down on the vehicle 50 and surface 310 from a position many metres (e.g. 20 metres) above the vehicle 50.
  • this is due to the images having been transformed to provide the transformed image 200 in the manner described above.
  • the images used to create the transformed image 200 have been captured from only 3 metres, or so, above the surface 310.
  • Figure 5 shows, in 3D space, the virtual view point 105 associated with the transformed image 200.
  • vectors V1 and V2 pass from the virtual view point (0, 0, 0) and intersect with the surface 310 at P1 and P2.
  • the vectors can be determined in 3D space by using the co-ordinates of P1 and P2 taken from the transformed image of Figure 4b, and the inverted transformation matrix.
  • the intersection of these vectors in 3D space at the road surface 310 can then be determined as the intersection between a line and a plane in a known manner.
  • the co-ordinates P1 and P2 can be un-projected from the transformed image 200 to points in 3D space.
  • Given these two points P1 and P2 in 3D space, it is possible to determine the distance between them (e.g. using trigonometric and/or Pythagorean relationships, along with, in some cases, the height h).
  • features, such as road widths, etc., in the transformed image 200 can be measured and compiled for use with digital maps.
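The line/plane intersection and distance steps described above might be sketched as follows. The ray directions V1 and V2 (which in the described method would come from un-projecting picked pixels with the inverted transformation matrix) and the height h = 3 m are hypothetical values:

```python
import math

def intersect_ground(direction, h):
    """Intersect a ray from the virtual view point (0, 0, 0) with the ground
    plane z = -h, where h is the imaging-device height (line/plane intersection)."""
    dx, dy, dz = direction
    if dz >= 0:
        raise ValueError("ray does not reach the ground plane")
    t = -h / dz                     # parameter where the ray meets z = -h
    return (t * dx, t * dy, -h)

def distance(p1, p2):
    """Euclidean distance between two un-projected points (Pythagorean relationship)."""
    return math.dist(p1, p2)

# Hypothetical rays through two picked pixels P1 and P2, with h = 3 m:
P1 = intersect_ground((0.5, 0.0, -1.0), 3.0)    # (1.5, 0.0, -3.0)
P2 = intersect_ground((-0.5, 0.0, -1.0), 3.0)   # (-1.5, 0.0, -3.0)
lane_width = distance(P1, P2)                   # 3.0 m between the two points
```

The coordinate convention (ground plane at z = -h, view point at the origin) follows the sphere setup described in relation to Figure 4a; it is a sketch, not the patent's exact computation.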
  • Figure 6a shows an example of a transformed image 400 in which a parking bay 450 is being measured.
  • Figure 6b shows an example of a transformed image 500 in which a road 550 width and length is being measured.
  • While a virtual field of view of roughly 170 degrees has been described, it will be appreciated that in further embodiments a different virtual field of view may be used.
  • Figure 7a shows a transformed image 600a having a virtual field of view of roughly 170 degrees.
  • Figure 7b shows a transformed image 600b, taken from the same panoramic image, having a virtual field of view of roughly 130 degrees;
  • Figure 7c shows a transformed image 600c, taken from the same panoramic image, having a virtual field of view of roughly 175 degrees.
  • Figure 8a shows an exemplary field of view 710 that might be taken from a camera 65 associated with the imaging device 60, described in relation to Figure 2.
  • Figure 8b shows the effective field of view 720 of that camera when using the virtual field of view described above.
  • Figures 9a and 9b show further examples of transformed images 800a, 800b from which measurements can be made.
  • Figure 10 shows an exemplary flow diagram 1000 of an exemplary method. The method comprises taking images, and transforming those images, and measuring features at a location for use in mapping.
  • One or more images are captured 1100 at a location, such as those shown in Figures 3a and 3b. Of course, in some examples only one image may be captured.
  • the one or more images can then be provided 1200 as a panoramic image (e.g. full or partial panoramic image).
  • a transformed image is provided 1300 having a virtual field of view showing features to be measured at the location and/or having a principal direction of view towards features to be measured.
  • Features, such as surfaces (e.g. road widths, etc.) can be measured 1400 in the transformed image.
  • Measured features can be compiled or used 1500 in maps, such as digital maps, which may be existing digital maps.
  • the measurement of these features can then be used to create and/or supplement digital maps. These may be provided with a database of such maps, or the like, and/or may be for use in portable electronic devices.
  • the resultant digital maps, comprising data on the measured features are accessible remotely, for example, over a network (e.g. cellular network, the Internet, etc.).
  • a feature such as a road width, length, parking bay size, etc.
  • this can be used to accurately represent segments 35, such as those shown in Figure 1 b.
  • features measured at locations are used to supplement existing digital maps.
  • While the vehicle 50 described in relation to Figure 2 was able to receive location-based signals, it will be appreciated that this need not always be the case.
  • the vehicle 50 may comprise the imaging device 60 without a receiver for location-based signals.
  • features measured in the above described manner may be used with existing location data.
  • While the above-described embodiments provide mapping principally in relation to maps for use with navigation devices, it will be appreciated that such mapping may also be used for recreation or business purposes.
  • the same methods and apparatus may be used to measure different features at locations.
  • digital maps are additionally used at locations such as golf courses. It can be helpful to be able to provide a digital map that accurately represents features on such a course, such as bunker sizes, fairway widths, etc. A skilled reader will readily appreciate that the same methods may be applied to measuring different features at further locations.
  • Alternative embodiments of the invention can be implemented as a computer program product for use with a computer system, the computer program product being, for example, a series of computer instructions stored on a tangible data recording medium, such as a diskette, CD-ROM, ROM, or fixed disk, or embodied in a computer data signal, the signal being transmitted over a tangible medium or a wireless medium, for example, microwave or infrared.
  • the series of computer instructions can constitute all or part of the functionality described above, and can also be stored in any memory device, volatile or non-volatile, such as semiconductor, magnetic, optical or other memory device.

Abstract

This invention relates to methods and apparatus for use in mapping, for example methods for measuring features, such as road widths, etc., at particular locations. In some examples, the measurement of features of a location from a transformed image of that location is described, whereby the transformed image has been obtained from one or more images (e.g. from a panoramic image of the location) and has been transformed so as to provide a virtual field of view showing the features to be measured at the location. The transformed image may be associated with a virtual view point having a principal direction of view that is orthogonal to a feature, such as a surface, to be measured. Associated digital maps, databases, etc. are also described.
PCT/EP2010/070892 2010-12-29 2010-12-29 Mapping methods and associated apparatus WO2012089258A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2010/070892 WO2012089258A1 (fr) 2010-12-29 2010-12-29 Mapping methods and associated apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2010/070892 WO2012089258A1 (fr) 2010-12-29 2010-12-29 Mapping methods and associated apparatus

Publications (1)

Publication Number Publication Date
WO2012089258A1 true WO2012089258A1 (fr) 2012-07-05

Family

ID=44303414

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2010/070892 WO2012089258A1 (fr) 2010-12-29 2010-12-29 Procédés de cartographie et appareil associé

Country Status (1)

Country Link
WO (1) WO2012089258A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999006943A1 (fr) * 1997-08-01 1999-02-11 Sarnoff Corporation Method and apparatus for performing local-to-global multiframe alignment to form mosaic images
EP1170173A2 (fr) * 2000-07-07 2002-01-09 Matsushita Electric Industrial Co., Ltd. Image composition system and method
US20040260469A1 (en) * 2002-06-12 2004-12-23 Kazufumi Mizusawa Drive assisting system

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
COORG S ET AL: "Acquisition of a large pose-mosaic dataset", COMPUTER VISION AND PATTERN RECOGNITION, 1998. PROCEEDINGS. 1998 IEEE COMPUTER SOCIETY CONFERENCE ON SANTA BARBARA, CA, USA 23-25 JUNE 1998, LOS ALAMITOS, CA, USA,IEEE COMPUT. SOC, US, 23 June 1998 (1998-06-23), pages 872 - 878, XP010291730, ISBN: 978-0-8186-8497-5, DOI: 10.1109/CVPR.1998.698707 *
DEZHEN SONG ET AL: "Automating inspection and documentation of remote building construction using a robotic camera", AUTOMATION SCIENCE AND ENGINEERING, 2005. IEEE INTERNATIONAL CONFERENCE ON EDMONTON, AB, CANADA AUG. 1, 2005, PISCATAWAY, NJ, USA,IEEE, 1 August 2005 (2005-08-01), pages 172 - 177, XP010834924, ISBN: 978-0-7803-9425-4, DOI: 10.1109/COASE.2005.1506764 *
JAVIER TRAVER V ET AL: "A review of log-polar imaging for visual perception in robotics", ROBOTICS AND AUTONOMOUS SYSTEMS, ELSEVIER SCIENCE PUBLISHERS, AMSTERDAM, NL, vol. 58, no. 4, 30 April 2010 (2010-04-30), pages 378 - 398, XP026929660, ISSN: 0921-8890, [retrieved on 20091030], DOI: 10.1016/J.ROBOT.2009.10.002 *
JUN CHEN ET AL: "The Research on Air Combat Environment Navigation Based on Spherical Panoramic Images", INFORMATION ENGINEERING AND COMPUTER SCIENCE (ICIECS), 2010 2ND INTERNATIONAL CONFERENCE ON, IEEE, PISCATAWAY, NJ, USA, 25 December 2010 (2010-12-25), pages 1 - 4, XP031841890, ISBN: 978-1-4244-7939-9 *
NI QIN ET AL: "Aligning windows of live video from an imprecise pan-tilt-zoom robotic camera into a remote panoramic display", ROBOTICS AND AUTOMATION, 2006. ICRA 2006. PROCEEDINGS 2006 IEEE INTERNATIONAL CONFERENCE ON ORLANDO, FL, USA MAY 15-19, 2006, PISCATAWAY, NJ, USA,IEEE, 15 May 2006 (2006-05-15), pages 3429 - 3436, XP010921788, ISBN: 978-0-7803-9505-3 *
T R NEUMANN: "Modeling insect compound eyes: space-variant spherical vision", ROBOTICS AND AUTONOMOUS SYSTEMS, ELSEVIER SCIENCE PUBLISHERS, AMSTERDAM, NL, 22 November 2002 (2002-11-22), Berlin, pages 360 - 367, XP055015357, ISBN: 3540001743 *
TITUS R. NEUMANN ET AL: "Behavior-Oriented Vision for Biomimetic Flight Control", PROCEEDINGS OF THE EPSRC/BBSRC INTERNATIONAL WORKSHOP ON BIOLOGICALLY INSPIRED ROBOTICS - THE LEGACY OF W. GREY WALTER, 14 August 2002 (2002-08-14), pages 196 - 203, XP055015450, Retrieved from the Internet <URL:http://www.kyb.mpg.de/fileadmin/user_upload/files/publications/pdfs/pdf1825.pdf> [retrieved on 20111223] *
ZHENG: "Pervasive views: area exploration and guidance using extended image media.", ACM, 2 PENN PLAZA, SUITE 701 - NEW YORK USA, 6 November 2005 (2005-11-06) - 11 November 2005 (2005-11-11), pages 986 - 995, XP040031004 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108121764A (zh) * 2016-11-26 2018-06-05 星克跃尔株式会社 Image processing device, image processing method, computer program, and computer-readable recording medium
CN108121764B (zh) * 2016-11-26 2022-03-11 星克跃尔株式会社 Image processing device, image processing method, computer program, and computer-readable recording medium

Similar Documents

Publication Publication Date Title
US11676307B2 (en) Online sensor calibration for autonomous vehicles
US11959771B2 (en) Creation and use of enhanced maps
JP6694395B2 (ja) Method and system for determining a position relative to a digital map
Kim et al. Ground vehicle navigation in harsh urban conditions by integrating inertial navigation system, global positioning system, odometer and vision data
US20210199437A1 (en) Vehicular component control using maps
US9366765B2 (en) Handheld GIS data collection device target augmentation
WO2010052558A2 (fr) Système et procédé pour l&#39;intégration précise d&#39;objets virtuels dans des applications de visites virtuelles panoramiques interactives
Zang et al. Accurate vehicle self-localization in high definition map dataset
US20160169662A1 (en) Location-based facility management system using mobile device
US20220179038A1 (en) Camera calibration for localization
JP4986883B2 (ja) Orientation device, orientation method, and orientation program
JP6135972B2 (ja) Orientation method, orientation program, and orientation device
US20180328733A1 (en) Position determining unit and a method for determining a position of a land or sea based object
Khoshelham et al. Vehicle positioning in the absence of GNSS signals: Potential of visual-inertial odometry
JP5769149B2 (ja) Mobile mapping system, method for measuring roadside objects using the same, and position specifying program
JP6773473B2 (ja) Surveying information management device and surveying information management method
TW201024664A (en) Method of generating a geodetic reference database product
WO2012089258A1 (fr) Mapping methods and associated apparatus
He et al. Capturing road network data using mobile mapping technology
Kim et al. A bimodal approach for land vehicle localization
Li et al. Terrestrial mobile mapping towards real-time geospatial data collection
Niu et al. Directly georeferencing terrestrial imagery using MEMS-based INS/GNSS integrated systems
TW201024665A (en) Method of generating a geodetic reference database product
Karsznia et al. The assessment of modern photogrammetric surveying methods in road works applications
Olesk et al. Geometric and error analysis for 3D map-matching

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10798574

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08/10/2013)

122 Ep: pct application non-entry in european phase

Ref document number: 10798574

Country of ref document: EP

Kind code of ref document: A1