WO2015136144A1 - Method and apparatus for superimposing images on a map - Google Patents
Method and apparatus for superimposing images on a map
- Publication number
- WO2015136144A1 (PCT/FI2014/050187)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- captured
- map
- location
- representation
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3667—Display of a road map
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/003—Maps
- G09B29/006—Representation of non-cartographic information on maps, e.g. population distribution, wind direction, radiation levels, air and sea routes
- G09B29/007—Representation of non-cartographic information on maps, e.g. population distribution, wind direction, radiation levels, air and sea routes using computer methods
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/10—Map spot or coordinate position indicators; Map reading aids
- G09B29/106—Map spot or coordinate position indicators; Map reading aids using electronic means
Definitions
- An example embodiment of the present invention relates generally to a method, apparatus and computer program product for superimposing an image upon a map and, more particularly, to a method, apparatus and computer program product for determining the location and size of a representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image.
- Context information may be provided in various manners including information regarding the time and date on which an image was captured, the location at which the image was captured, the subject of the image, such as a person or place, etc.
- the context information may be visually presented in the form of a map.
- reduced size representations (e.g., thumbnails) of the images may be positioned upon the map so as to coincide with the location at which the images were captured.
- a user may review the map and identify the images to be further considered based upon the location at which the images were captured as represented by their relative locations upon the map.
- the reduced size representations of the images may largely overlap one another such that a user may have difficulty identifying and reviewing the individual images captured at a respective location.
- the reduced size representations of the images may largely obscure portions of the map by almost fully covering certain portions of the map at which many images were captured, thereby also making it difficult for the user to associate the images with a particular location upon the map.
- a method, apparatus, and computer program product are therefore provided in accordance with an example embodiment in order to provide context information to a user regarding one or more images.
- the method, apparatus, and computer program product of an example embodiment may provide the context information in a visual manner so as to facilitate the identification by a user of one or more images of interest.
- the method, apparatus and computer program product of an example embodiment may facilitate the superimposition of representations of the images upon the map in a manner that permits context information to be provided for a plurality of images with less risk of the representations of the images significantly overlapping one another or substantially obscuring a portion of a map.
- a method, in an example embodiment, includes identifying, for an image, a location at which the image was captured, a direction along which the image was captured and a level of zoom associated with the image. The method of this example embodiment also includes determining a location and size of a representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image. In this example embodiment, the method also includes causing a map to be presented with the representation of the image superimposed thereupon.
- in determining the location and size of the representation of the image to be superimposed upon the map, the method of an example embodiment may determine the location of the representation of the image to be at a distance from the location at which the image was captured that has a direct relationship to the level of zoom associated with the image. In this regard, the distance from the location at which the image was captured may increase as the image is captured at greater levels of zoom. Also, the method of an example embodiment may determine the size of the representation of the image to be superimposed upon the map so as to have an inverse relationship to the level of zoom associated with the image. In this regard, the size of the representation of the image may decrease as the image is captured at greater levels of zoom.
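- as a concrete illustration of the direct and inverse relationships just described, the following Python sketch computes a candidate map position and size for the representation of an image; the function name, the base constants and the strictly proportional forms are illustrative assumptions rather than the claimed method itself.

```python
import math

def place_representation(capture_x, capture_y, azimuth_deg, zoom,
                         base_offset=50.0, base_size=120.0):
    """Place an image's representation on a map (map units are arbitrary).

    The distance from the capture point grows in direct relationship to the
    level of zoom, while the representation's size shrinks in inverse
    relationship to it; simple proportionality is assumed here, although
    the embodiment permits other direct/inverse relationships.
    """
    theta = math.radians(azimuth_deg)       # azimuth: clockwise from north
    distance = base_offset * zoom           # direct relationship to zoom
    x = capture_x + distance * math.sin(theta)
    y = capture_y + distance * math.cos(theta)
    size = base_size / zoom                 # inverse relationship to zoom
    return (x, y), size
```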
- the method of an example embodiment may cause the representation of the image to be superimposed upon the map such that the representation of the image is caused to project out in the third dimension from the map.
- the method may cause the representation of the image to project outwardly in the third dimension by causing the representation of the image to be tilted relative to the map in an instance in which an elevation angle of the direction has a non-zero value.
- an apparatus, in another example embodiment, includes at least one processor and at least one memory including computer program code, with the at least one memory and the computer program code configured to, with the processor, cause the apparatus to at least identify, for an image, a location at which the image was captured, a direction along which the image was captured, and a level of zoom associated with the image.
- the at least one memory and the computer program code are also configured to, with the processor, cause the apparatus of this example embodiment to determine a location and size of a representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image.
- the at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus of this example embodiment to cause a map to be presented with the representation of the image superimposed thereupon.
- the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus of an example embodiment to determine the location and size of the representation of the image to be superimposed upon the map by determining the location of the representation of the image to be at a distance from the location at which the image was captured that has a direct relationship to the level of zoom associated with the image.
- the distance from the location at which the image was captured may increase as the image is captured at greater levels of zoom.
- the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus of an example embodiment to determine the location and size of the representation of the image to be superimposed upon the map by determining the size of the representation of the image to be superimposed upon the map to have an inverse relationship to the level of zoom associated with the image.
- the size of the representation of the image to be superimposed upon the map may decrease as the image is captured at greater levels of zoom.
- the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus of an example embodiment to cause the representation of the image to be superimposed upon the map such that the representation of the image is caused to project out in the third dimension from the map.
- the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to cause the representation of the image to project outwardly in the third dimension by causing the representation of the image to be tilted relative to the map in an instance in which the elevation angle has a non-zero value.
- a computer program product includes at least one non-transitory computer-readable storage medium having computer-executable program code portions stored therein, with the computer-executable program code portions including program code instructions configured, upon execution, to identify, for an image, a location at which the image was captured, a direction along which the image was captured and a level of zoom associated with the image.
- the computer-executable program code portions of this example embodiment also include program code instructions configured to determine a location and size of a representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image.
- the computer-executable program code portions of this example embodiment also include program code instructions configured, upon execution, to cause a map to be presented with the representation of the image superimposed thereupon.
- the program code instructions configured to determine the location and size of the representation of the image to be superimposed upon the map may include program code instructions configured to determine the location of the representation of the image to be superimposed upon the map to be a distance from the location at which the image was captured that has a direct relationship to the level of zoom associated with the image. In this example embodiment, the distance from the location at which the image was captured may increase as the image is captured at greater levels of zoom.
- the program code instructions configured to determine the location and size of the representation of the image to be superimposed upon the map in accordance with an example embodiment may include program code instructions configured to determine the size of the representation of the image to be superimposed upon the map to have an inverse relationship to the level of zoom associated with the image. In this example embodiment, the size of the representation of the image to be superimposed upon the map may decrease as the image is captured at greater levels of zoom.
- the program code instructions of an example embodiment that are configured to cause the representation of the image to be superimposed upon the map may be further configured to cause the representation of the image to be superimposed upon the map so as to project out in the third dimension from the map.
- the program code instructions may be further configured to cause the representation of the image to project outwardly in the third dimension by causing the representation of the image to be tilted relative to the map in an instance in which the elevation angle has a non-zero value.
- an apparatus, in yet another example embodiment, includes means for identifying, for an image, a location at which the image was captured, a direction along which the image was captured and a level of zoom associated with the image.
- the apparatus of this example embodiment also includes means for determining a location and size of a representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image.
- the apparatus of this example embodiment also includes means for causing the map to be presented with the representation of the image superimposed thereupon.
- Figure 1 is a block diagram of an apparatus that may be specifically configured in accordance with an example embodiment of the present invention
- Figure 2 is a flow chart illustrating operations performed, such as by the apparatus of Figure 1, in accordance with an example embodiment of the present invention
- Figure 3 is a screen display of a map having a representation of three images superimposed thereupon in accordance with an example embodiment of the present invention
- Figure 4 is a representation of the screen display of Figure 3 which depicts the location at which the three images were captured and the direction along which the three images were captured relative to the location and size of the representations of the three images superimposed upon the map in accordance with an example embodiment of the present invention.
- as used herein, 'circuitry' refers to (a) hardware-only circuit implementations (for example, implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present.
- This definition of 'circuitry' applies to all uses of this term herein, including in any claims.
- the term 'circuitry' also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware.
- the term 'circuitry' as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device.
- a method, apparatus and computer program product are provided in accordance with an example embodiment of the present invention in order to provide context information for one or more images.
- the method, apparatus and computer program product of an example embodiment may provide the context information in a visual manner, such as by causing a map to be presented with a representation of an image superimposed thereupon.
- the method, apparatus and computer program product of an example embodiment of the present invention may, however, leverage the information associated with the image, such as information regarding the location at which the image was captured, the direction along which the image was captured, and the level of zoom associated with the image, in order to determine the location and size of the representation of the image to be superimposed upon the map.
- the method, apparatus and computer program product of an example embodiment may reduce the risk that the representations of the images will significantly overlap with one another and/or obscure substantial portions of the underlying map, while providing a user with a visual representation of additional context information so as to facilitate the user's identification and selection of an image in an efficient and intuitive manner.
- Figure 1 depicts an apparatus 10 that may be specifically configured in accordance with an example embodiment of the present invention.
- the apparatus may be embodied by or associated with a variety of electronic devices including a mobile terminal, such as a personal digital assistant (PDA), mobile telephone, smartphone, companion device (for example, a smart watch), pager, mobile television, gaming device, laptop computer, camera, tablet computer, touch surface, video recorder, audio/video player, radio, electronic book, positioning device (for example, a global positioning system (GPS) device), or any combination of the aforementioned, and other types of voice and text communications systems.
- the apparatus may be embodied by or associated with a fixed computing device, such as a computer workstation, a personal computer, a server or the like.
- the apparatus of the embodiment of Figure 1 may include or otherwise be in communication with a processor 12, a memory device 14, a user interface 16 and optionally a communication interface and/or an image capturing device, e.g., a camera.
- the processor 12 (and/or co-processors or any other processing circuitry assisting or otherwise associated with the processor) may be in communication with the memory device 14 via a bus for passing information among components of the apparatus.
- the memory device may be non-transitory and may include, for example, one or more volatile and/or non-volatile memories.
- the memory device may be an electronic storage device (for example, a computer readable storage medium) comprising gates configured to store data (for example, bits) that may be retrievable by a machine (for example, a computing device like the processor).
- the memory device may be configured to store information, data, content, applications, instructions, or the like for enabling the apparatus to carry out various functions in accordance with an example embodiment of the present invention.
- the memory device could be configured to buffer input data for processing by the processor.
- the memory device could be configured to store instructions for execution by the processor.
- the memory device may store a plurality of images and associated information, e.g., metadata, as well as map data from which an image of a map is constructed.
- the images and/or the map data may be stored remotely and accessed by the processor, such as via a communication interface.
- the apparatus 10 may be embodied by various devices. However, in some embodiments, the apparatus may be embodied as a chip or chip set. In other words, the apparatus may comprise one or more physical packages (for example, chips) including materials, components and/or wires on a structural assembly (for example, a circuit board). The structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon. The apparatus may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single "system on a chip." As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
- the processor 12 may be embodied in a number of different ways.
- the processor may be embodied as one or more of various hardware processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing circuitry including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like.
- the processor may include one or more processing cores configured to perform independently.
- a multi-core processor may enable multiprocessing within a single physical package.
- the processor may include one or more processors configured in tandem via the bus to enable independent execution of instructions, pipelining and/or multithreading.
- the processor 12 may be configured to execute instructions stored in the memory device 14 or otherwise accessible to the processor.
- the processor may be configured to execute hard coded functionality.
- the processor may represent an entity (for example, physically embodied in circuitry) capable of performing operations according to an embodiment of the present invention while configured accordingly.
- the processor when the processor is embodied as an ASIC, FPGA or the like, the processor may be specifically configured hardware for conducting the operations described herein.
- the processor when the processor is embodied as an executor of software instructions, the instructions may specifically configure the processor to perform the algorithms and/or operations described herein when the instructions are executed.
- the processor may be a processor of a specific device (for example, the client device 10 and/or a network entity) configured to employ an embodiment of the present invention by further configuration of the processor by instructions for performing the algorithms and/or operations described herein.
- the processor may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor.
- the apparatus 10 of the illustrated embodiment also includes or is in communication with a user interface 16.
- the user interface such as a display, may be in communication with the processor 12 to provide output to the user and, in some embodiments, to receive an indication of a user input, such as in an instance in which the user interface includes a touch screen display.
- the user interface may also include a keyboard, a mouse, a joystick, touch areas, soft keys, one or more microphones, a plurality of speakers, or other input/output mechanisms.
- the processor may comprise user interface circuitry configured to control at least some functions of one or more user interface elements such as a display and, in some embodiments, a plurality of speakers, a ringer, one or more microphones and/or the like.
- the processor and/or user interface circuitry comprising the processor may be configured to control one or more functions of one or more user interface elements through computer program instructions (for example, software and/or firmware) stored on a memory accessible to the processor (for example, memory device 14, and/or the like).
- the apparatus 10 of the illustrated embodiment may also optionally include a communication interface that may be any means such as a device or circuitry embodied in either hardware or a combination of hardware and software that is configured to receive and/or transmit data from/to a communications device in communication with the apparatus.
- the communication interface may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network.
- the communication interface may include the circuitry for interacting with the antenna(s) to cause transmission of signals via the antenna(s) or to handle receipt of signals received via the antenna(s).
- the communication interface may alternatively or also support wired communication.
- the communication interface may include a communication modem and/or other hardware and/or software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
- the apparatus 10 of some embodiments may include or be associated with a camera, a video recorder or other image capturing device that is in communication with the processor 12.
- the image capturing device may be any means for capturing an image, such as still image, video images or the like, for storage, display or transmission including, for example, an imaging sensor.
- the image capturing device may include a digital camera including an imaging sensor capable of capturing an image.
- the image capturing device may include all hardware, such as a lens, an imaging sensor and/or other optical device(s), and software necessary for capturing an image.
- the image capturing device may include only the hardware needed to view an image, while the memory stores instructions for execution by the processor in the form of software necessary to capture, store and process an image.
- the image capturing device may further include a processing element such as a co-processor which assists the processor in processing image data and an encoder and/or decoder for compressing and/or decompressing image data.
- the encoder and/or decoder may encode and/or decode according to a predefined format, such as a JPEG standard format.
- the image that is captured may be stored for future viewings and/or manipulations in the memory of the apparatus and/or in a memory external to the apparatus.
- the apparatus may include means, such as the processor 12 or the like, for identifying a location at which an image was captured, a direction along which the image was captured and a level of zoom associated with the image.
- the location may be identified in various manners including, for example, latitude and longitude coordinates, an address, an offset from a predefined point of reference or the like.
- the location at which an image was captured may be automatically determined by the image capturing device at the time at which the image is captured, such as based upon a GPS device embodied by or otherwise associated with the image capturing device.
- the location may be provided by the user based upon user input at a time coincident with the capture of the image or at some time following the capture of the image.
- the direction along which the image was captured identifies the direction in which the image capturing device was pointed at the time that the image was captured.
- the direction may be defined in terms of an azimuth angle and, as described below, may optionally also include an elevation angle.
- the direction along which the image was captured may be determined automatically by the image capturing device at the time at which the image was captured, such as based upon the reading of a compass, gyroscope or the like that is embodied by or otherwise associated with the image capturing device.
- the direction may be provided by the user of the image capturing device at the time at which the image is captured or at some time thereafter.
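- for illustration, a direction expressed as an azimuth angle plus an optional elevation angle can be reduced to a unit vector; the following minimal Python sketch assumes an east/north/up axis convention, which is not specified by the text.

```python
import math

def direction_vector(azimuth_deg, elevation_deg=0.0):
    """Convert a capture direction to an east/north/up unit vector.

    Azimuth is measured clockwise from north; a non-zero elevation angle
    contributes the upward or downward component discussed later.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    east = math.cos(el) * math.sin(az)
    north = math.cos(el) * math.cos(az)
    up = math.sin(el)
    return (east, north, up)
```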
- the level of zoom associated with the image may also be defined in various manners including, for example, the zoom ratio of the image capturing device at the time at which the image was captured. As such, the level of zoom provides an indication as to whether the image capturing device was zoomed in at the time that the image was captured or zoomed out at the time that the image was captured. Thus, larger or greater levels of zoom are associated with an instance in which the image capturing device is zoomed in and lesser or smaller levels of zoom are associated with instances in which the image capturing device is zoomed out.
- the level of zoom may be determined automatically by the image capturing device or may be based upon user input defining the level of zoom at the time that the image was captured.
- the location, direction and level of zoom of an image may be identified in various manners.
- the apparatus such as the processor 12, may be configured to identify the location, the direction and the level of zoom of the image from information provided by the image capturing device, such as based upon the information automatically captured by the image capturing device and/or provided by a user of the image capturing device.
- the location, the direction, the level of zoom and optionally other parameters associated with the image may be stored in association with the image.
- metadata may be associated with the image and stored therewith which defines the location, the direction, the level of zoom and optionally other parameters associated with the image.
- the apparatus such as the processor, may be configured to identify the location, the direction and the level of zoom associated with the image by retrieving the metadata associated with the respective image.
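- a minimal sketch of that retrieval step is shown below, assuming the capture parameters were stored as a simple metadata mapping alongside the image; the key names are hypothetical, and real metadata might instead come from EXIF fields or user input.

```python
from dataclasses import dataclass

@dataclass
class CaptureInfo:
    latitude: float       # degrees
    longitude: float      # degrees
    azimuth_deg: float    # direction the image capturing device pointed
    elevation_deg: float  # upward/downward component, if any
    zoom: float           # zoom ratio at the time of capture

def identify_capture_info(metadata: dict) -> CaptureInfo:
    """Pull the location, direction and level of zoom out of per-image metadata."""
    return CaptureInfo(
        latitude=metadata["lat"],
        longitude=metadata["lon"],
        azimuth_deg=metadata["azimuth"],
        elevation_deg=metadata.get("elevation", 0.0),
        zoom=metadata.get("zoom", 1.0),
    )
```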
- the apparatus 10 may also include means, such as the processor 12 or the like, for determining a location and size of a representation of the image to be superimposed upon a map.
- the apparatus such as the processor, may be configured to determine the location and size of the representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image.
- three different images captured from the same location and along the same general direction at different levels of zoom are superimposed upon a map.
- image 30 has a broader field of view and a correspondingly lower level of zoom
- image 32 has an intermediate field of view and an intermediate level of zoom
- image 34 has a smaller field of view and a larger level of zoom.
- Figure 4 depicts the location 36 from which the image capturing device captured the three images. Additionally, diverging lines 38 fan outwardly from the location at which the images were captured to illustrate the broadest field of view, that is, the field of view associated with image 30, and to generally represent the direction along which the images are captured.
- the direction along which image 32 was captured, that is, the direction in which the image capturing device was pointed at the time that image 32 was captured, is slightly to the left, along a line that is rotated counter-clockwise relative to the direction along which image 30 was captured.
- the direction along which image 34 was captured is slightly to the right, along a line that is rotated clockwise relative to the directions along which images 30 and 32 were captured.
- the apparatus 10 may determine the location of the representation of the image to be in the same direction in which the image capturing device was pointed at the time that the respective image was captured.
- image 32 is located slightly to the left of image 30
- image 34 is located slightly to the right of image 32 and slightly to the right of the center of image 30.
- the apparatus, such as the processor, of an example embodiment may determine the location of the representation of the image such that the representation of the image is located along and, in an example embodiment, centered about a line that extends in the direction along which the respective image was captured.
- in determining the location of the representation of the image to be superimposed upon the map, the apparatus 10, such as the processor 12, of an example embodiment may also be configured to determine the location of the representation of the image to be at a distance from the location at which the image was captured that has a direct relationship to the level of zoom associated with the image.
- the distance at which the representation of the image is located relative to the location at which the image was captured increases as the image is captured at greater levels of zoom and decreases as the image is captured at lesser levels of zoom.
- an image having a greater level of zoom may be positioned proximate the location upon the underlying map that is the subject of the zoomed image, such as shown with respect to the representation of image 34 that is located proximate that portion of the observatory depicted in the underlying map that is the subject of the zoomed-in image.
- the distance at which the representation of the image is superimposed upon the map from the location at which the image was captured may be proportional to the level of zoom at which the image was captured.
- the location of the representation of the image upon the map relative to the location at which the image was captured may have other types of direct relationships to the level of zoom associated with the capture of the image in other embodiments.
- image 32 is captured with a greater level of zoom than image 30 and, as such, is located further from the location at which the images were captured than the representation of image 30.
- image 34 was captured at a greater level of zoom than either image 32 or image 30 such that the representation of image 34 is located further from the location at which the images were captured than the representations of either image 32 or image 30.
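- the behaviour described for images 30, 32 and 34 can be reproduced with the illustrative place_representation sketch from above; the capture point, azimuths and zoom levels below are made-up values, not taken from the figures.

```python
# Three images captured from the same spot at increasing levels of zoom,
# mirroring images 30, 32 and 34. (All values are illustrative only.)
capture = (100.0, 100.0)
for name, azimuth, zoom in [("image 30", 0.0, 1.0),
                            ("image 32", -5.0, 2.0),
                            ("image 34", 8.0, 4.0)]:
    (x, y), size = place_representation(*capture, azimuth, zoom)
    print(f"{name}: position=({x:.1f}, {y:.1f}), size={size:.1f}")
# The greater the zoom, the farther the representation lies from the
# capture point (100, 100) and the smaller it is drawn.
```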
- the apparatus 10 such as the processor 12, of an example embodiment may be configured to determine the size of the representation of the image so as to have an inverse relationship to the level of zoom associated with the image.
- the size of the representation of the image to be superimposed upon the map may decrease as the image is captured at greater levels of zoom and may conversely increase as the image is captured at lesser levels of zoom.
- image 32 is captured at a greater level of zoom than image 30 and, as such, may have a smaller size than image 30.
- image 34 may be captured at an even greater level of zoom than image 32 and image 30 and, as such, may be represented in a manner that is smaller than either images 32 or 30.
- the apparatus, such as the processor, may determine the size of the representation of the image to be superimposed upon the map in a manner that is inversely proportional to the level of zoom associated with the image.
- the relationship between the size of the representation of the image to be superimposed upon the map and the level of zoom associated with the image may, however, be defined to have other types of inverse relationships in other example embodiments.
- the apparatus 10 may also include means, such as the processor 12, the user interface 16 or the like, for causing a map to be presented with the representation of the image superimposed thereupon.
- the map data from which the representation of the map is constructed for display may be stored in memory 14 or may be provided by an external database accessible by the processor via, for example, a communication interface.
- the apparatus, such as the processor, the user interface or the like may be configured to cause any of a wide variety of different representations of maps to be presented upon a display.
- the map may be a two-dimensional depiction of streets and other geographical features with the names of various cities or other regions denoted on the map.
- the map may be based upon images, such as shown in Figure 3 and provided by various commercial systems including, for example, the HERE 3D Map system.
- the apparatus 10 such as the processor 12, of an example embodiment may cause the representation of the image to project outwardly in a third dimension from the map with the images positioned at a distance and with a size relative to the underlying map and the location at which the images were captured that has been determined as described above.
- the representations of the images are superimposed upon the map such that the images face the location from which the images were captured.
- the representations of the images are also presented in a manner that the images appear to stand upright and to extend outwardly from the underlying map, such as in the manner of postcards standing on an edge.
- the images may be highlighted relative to the underlying map and the representations of the images may not obscure as much of the underlying map as in an instance in which the images were laid flat or disposed in the same plane as the map.
- the method, apparatus and computer program product of an example embodiment may provide additional context for the images simply by the manner in which the images are presented such that a person viewing the images superimposed upon the map may efficiently and intuitively interpret the images relative to the underlying map and obtain information regarding the relative direction of different features and the level of detail regarding various features.
- the manner in which the location and size of the representations of the images are determined and the manner in which the representations of the images are superimposed upon the map may permit more images that were captured at the same location to be visible at one time without significant overlap since the representations of at least some of the images are spaced from the location at which the images were captured and are not stacked one upon another, thereby providing a viewer of the images and the underlying map with ready access to additional images at different levels of detail.
- although the representations of the images 30, 32 and 34 depicted in the example embodiment of Figures 3 and 4 are planar, the representations of the images may have other shapes, such as curved shapes, in other embodiments.
- the representations of the images may be concave with the concavity of the images facing the location from which the images were captured.
- although the representations of the images 30, 32 and 34 appear to extend orthogonally from the underlying map, such as by appearing to extend only in the z-direction that projects orthogonally outward from an xy plane defined by the map, the representations of the images may be tilted so as to define an acute angle with respect to the underlying map, such as by being tilted forwardly or rearwardly, in other embodiments.
- in an instance in which the information regarding the direction along which the image was captured includes not only an azimuth angle but also an elevation angle, a non-zero elevation angle defines an instance in which the direction in which the image capturing device was pointed included an upward or downward component at the time that the image was captured.
- the apparatus 10 such as the processor 12, the user interface 16 or the like, may be configured to cause the representation of the image that projects outwardly in the third dimension from the underlying map to be tilted relative to the map in an instance in which the elevation angle has a non-zero value.
- the representation of the image may be leaned backward by an amount that is based upon, such as an amount that is proportional to or equal to, the elevation angle, so as to be representative of an image captured by an image capturing device that was at least partially upwardly facing.
- the representation of the image may be leaned forwardly by an amount that is based upon, such as an amount that is proportional to or equal to, the elevation angle, so as to be representative of an image captured by an image capturing device that was at least partially downwardly facing.
- the tilting of the representations of the images in response to a non-zero elevation angle may provide additional context information and may further facilitate the user's efficient and intuitive review and interpretation of the representations of the images superimposed upon the map.
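- a sketch of that tilt is given below, assuming the quad from the earlier postcard_corners sketch leans about its bottom edge by the elevation angle: a positive (upward-facing) elevation leans it backward, away from the capture point, and a negative elevation leans it forward.

```python
import math

def tilt_postcard(corners, capture, elevation_deg):
    """Lean an upright quad about its bottom edge by the elevation angle."""
    el = math.radians(elevation_deg)
    (blx, bly, _), (brx, bry, _), (_, _, h), _ = corners  # h: quad height
    # Unit direction pointing away from the capture location in the map plane.
    mx, my = (blx + brx) / 2.0, (bly + bry) / 2.0
    ax, ay = mx - capture[0], my - capture[1]
    norm = math.hypot(ax, ay) or 1.0
    ax, ay = ax / norm, ay / norm
    # Rotating the top edge about the bottom edge moves the top outward by
    # h*sin(el) while reducing its height to h*cos(el).
    dx, dy, dz = ax * h * math.sin(el), ay * h * math.sin(el), h * math.cos(el)
    return corners[:2] + [(brx + dx, bry + dy, dz), (blx + dx, bly + dy, dz)]
```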
- although representations of still images may be presented upon a map as described above, representations of other types of images, such as video images, may be presented in a similar manner.
- a thumbnail or other representation of a video may be presented upon the map based upon the location, direction and level of zoom associated with the video such that a user may access the video by selecting, e.g., by clicking upon, the representation of the video.
- in an instance in which one or more of these parameters vary during the course of the video, the parameter(s) that define the representation of the video may be determined in various manners, including by being based upon an average value of the parameter(s) that vary, the initial value of the parameter(s) that vary or the final value of the parameter(s) that vary.
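- one of the listed options is sketched below: collapsing a video's time-varying parameters to their average, with the initial and final values as the trivial alternatives; the per-frame (azimuth, zoom) sampling is an assumed input format.

```python
def video_representation_params(frames, mode="average"):
    """Collapse per-frame (azimuth_deg, zoom) samples into a single pair.

    `frames` is a non-empty list of (azimuth_deg, zoom) tuples sampled over
    the video; the text permits the average, initial or final values.
    """
    if mode == "initial":
        return frames[0]
    if mode == "final":
        return frames[-1]
    n = len(frames)
    return (sum(a for a, _ in frames) / n, sum(z for _, z in frames) / n)
```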
- Figure 2 is a flowchart of an apparatus 10, method and computer program product according to example embodiments of the invention. It will be understood that each block of the flowchart, and combinations of blocks in the flowchart, may be implemented by various means, such as hardware, firmware, processor, circuitry, and/or other communication devices associated with execution of software including one or more computer program instructions. For example, one or more of the procedures described above may be embodied by computer program instructions. In this regard, the computer program instructions which embody the procedures described above may be stored by a memory device 14 of an apparatus employing an embodiment of the present invention and executed by a processor 12 of the apparatus. As will be appreciated, any such computer program instructions may be loaded onto a computer or other programmable apparatus (for example, hardware) to produce a machine, such that the resulting computer or other programmable apparatus implements the functions specified in the flowchart blocks.
- These computer program instructions may also be stored in a computer-readable memory that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture, the execution of which implements the function specified in the flowchart blocks.
- the computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide operations for implementing the functions specified in the flowchart blocks.
- blocks of the flowchart support combinations of means for performing the specified functions and combinations of operations for performing the specified functions. It will also be understood that one or more blocks of the flowchart, and combinations of blocks in the flowchart, can be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.
- certain ones of the operations above may be modified or further amplified. Furthermore, in some embodiments, additional optional operations may be included. Modifications, additions, or amplifications to the operations above may be performed in any order and in any combination.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Mathematical Physics (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Ecology (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Automation & Control Theory (AREA)
- Image Processing (AREA)
Abstract
A method, apparatus and computer program product are provided for visually providing context information to a user regarding one or more images. In terms of the method and for a respective image, a location at which the image was captured, a direction along which the image was captured and a level of zoom associated with the image may be identified. The method may also include determining a location and size of a representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image. The method may further include causing a map to be presented with the representation of the image superimposed thereupon.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
PCT/FI2014/050187 (WO2015136144A1, fr) | 2014-03-14 | 2014-03-14 | Method and apparatus for superimposing images on a map
Publications (1)
Publication Number | Publication Date
---|---
WO2015136144A1 | 2015-09-17
Family
ID=50439408
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
PCT/FI2014/050187 | WO2015136144A1 (fr) | 2014-03-14 | 2014-03-14
Country Status (1)
Country | Link
---|---
WO | WO2015136144A1 (fr)
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
US20090125234A1 (en) * | 2005-06-06 | 2009-05-14 | TomTom International B.V. | Navigation Device with Camera-Info
US20090240431A1 (en) * | 2008-03-24 | 2009-09-24 | Google Inc. | Panoramic Images Within Driving Directions
US20130162665A1 (en) * | 2011-12-21 | 2013-06-27 | James D. Lynch | Image view in mapping
WO2013098065A1 (fr) * | 2011-12-30 | 2013-07-04 | Navteq B.V. | Path-side imagery superimposed on a map
- 2014-03-14: PCT/FI2014/050187 filed as WO2015136144A1 (active Application Filing)
Similar Documents
Publication | Publication Date | Title
---|---|---
US11595569B2 | | Supplying content aware photo filters
US9201625B2 | | Method and apparatus for augmenting an index generated by a near eye display
US9661214B2 | | Depth determination using camera focus
US9298970B2 | | Method and apparatus for facilitating interaction with an object viewable via a display
US9729645B2 | | Method and apparatus for obtaining an image associated with a location of a mobile terminal
KR20150059466A | | Method and apparatus for recognizing a specific object within an image in an electronic device
US20180196819A1 | | Systems and apparatuses for providing an augmented reality real estate property interface
CN108462818B | | Electronic device and method for displaying a 360-degree image in the electronic device
US9092897B2 | | Method and apparatus for displaying interface elements
US11740850B2 | | Image management system, image management method, and program
US20150235630A1 | | Transparency Determination for Overlaying Images on an Electronic Display
JP2017211811A | | Display control program, display control method and display control device
JP6686547B2 | | Image processing system, program, and image processing method
CN103327246A | | Multimedia shooting processing method, apparatus and intelligent terminal
JP6617547B2 | | Image management system, image management method, and program
US20140168258A1 | | Method and apparatus for augmenting an image of a location with a representation of a transient object
US9063692B2 | | Method and apparatus for sharing content
US8867785B2 | | Method and apparatus for detecting proximate interface elements
JP2017168132A | | Virtual object display system, display system program and display method
WO2015136144A1 | | Method and apparatus for superimposing images on a map
US9075432B2 | | Method and apparatus for sharing content
KR102605451B1 | | Electronic device and method for providing a plurality of services respectively corresponding to a plurality of external objects included in an image
US9412150B2 | | Method and apparatus for visually representing objects with a modified height
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14715341; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 14715341; Country of ref document: EP; Kind code of ref document: A1