WO2011064895A1 - Map display device, map display method, map display program, and recording medium
- Publication number
- WO2011064895A1 (PCT/JP2009/070139)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- map
- display
- angle
- touch panel
- overhead
- Prior art date
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/003—Maps
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3635—Guidance using 3D or perspective road maps
- G01C21/3638—Guidance using 3D or perspective road maps including 3D objects and buildings
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/10—Map spot or coordinate position indicators; Map reading aids
Definitions
- the present invention relates to a map display device that displays map data, a map display method, a map display program, and a recording medium.
- When displaying map data in a navigation device or the like, a technique is known for switching the display of map data from a plan view to an overhead view by moving an operation knob on an operation bar up and down (see, for example, Patent Document 1 below).
- In Patent Document 1, when the operation knob is located at the top of the map operation bar, a road map based on a planar map is displayed.
- When the operation knob is moved downward on the screen, the map display format is switched from the planar map to the overhead map, and the overhead map is displayed.
- When the knob is moved further downward, the look-down angle of the overhead map and the height of the viewpoint change accordingly, and a map range farther away is displayed.
- When the operation knob is moved to the bottom, an overhead map is displayed.
- JP 2007-192881 A
- Japanese Patent Laid-Open No. 06-282378
- A map display device according to the invention includes display means capable of displaying a map viewed from a predetermined overhead angle with respect to the ground surface, a touch panel provided so as to overlap the display means, determination means for determining the contact strength when an object contacts the touch panel, and display control means for controlling the display mode of the map displayed on the display means based on the determination result of the determination means.
- The display control means scrolls the map when the contact strength is equal to or lower than a predetermined strength, and changes the overhead angle when the contact strength is higher than the predetermined strength.
- A map display method according to the invention is a map display method in a map display device comprising display means and a touch panel provided on the display means, wherein the display means displays a map viewed from a predetermined overhead angle with respect to the ground surface.
- a map display program according to the invention of claim 7 causes a computer to execute the map display method of claim 6.
- a recording medium is characterized in that the map display program according to the seventh aspect is recorded in a computer-readable state.
- FIG. 1 is a block diagram illustrating a functional configuration of the map display device according to the embodiment.
- a map display device 100 according to the embodiment includes a display unit 101, a touch panel 102, a determination unit 103, and a display control unit 104.
- the display unit 101 displays a map viewed from a predetermined overhead angle with respect to the ground surface.
- Although the method of obtaining the bird's-eye view angle is arbitrary, in the present embodiment the reference point on the ground surface is viewed from a viewpoint position at a predetermined height above the ground surface (that is, the reference point on the ground surface is viewed obliquely).
- The angle formed between the ground surface and the line segment connecting the viewpoint position and the reference point is referred to as the overhead angle.
- When displaying a bird's-eye view of the map in which the ground surface is viewed obliquely from above, the display unit 101 displays a map in which the ground surface is tilted (the tilt angle is changed) around the reference point.
- FIG. 2 is an explanatory diagram illustrating an example of how to obtain an overhead angle in the map display device.
- features such as buildings, homes, and road signs exist on the ground surface G.
- When the viewpoint position is V1, the angle formed by the line segment L1 connecting the viewpoint position V1 and the reference point D with the ground surface G is θ1, and θ1 is approximately 90°.
- When the viewpoint position is V2, the angle formed by the line segment L2 connecting the viewpoint position V2 and the reference point D with the ground surface G is θ2, where θ2 < θ1.
- The viewpoint V3 indicates a viewpoint close to the ground surface, for example at the same height as the viewpoint of a pedestrian or a vehicle driver.
- A map in which the ground surface (reference point) is viewed from the viewpoint V1 is referred to as a plan view, a map viewed from the viewpoint V2 as an overhead view, and a map in which the reference point is viewed from the viewpoint V3 as a driver's view. Therefore, the "map viewed from a predetermined overhead angle with respect to the ground surface" includes a map viewed from any one of the viewpoints V1 to V3.
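As a rough sketch of the geometry above, the overhead angle can be computed from the viewpoint's height above the ground surface and its horizontal distance to the reference point D. The function and parameter names are illustrative, not taken from the patent:

```python
import math

def overhead_angle(viewpoint_height, horizontal_distance):
    """Angle (in degrees) between the ground surface G and the line
    segment from the viewpoint to the reference point D."""
    return math.degrees(math.atan2(viewpoint_height, horizontal_distance))
```

A viewpoint directly above the reference point (horizontal distance 0) gives an angle of about 90°, matching the plan view; lowering the viewpoint toward the ground surface drives the angle toward the driver's view near 0°.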
- the touch panel 102 is provided so as to overlap the display unit 101.
- the touch panel 102 is used for receiving an operation input to the map display device 100. Specifically, for example, an operation button or the like is displayed on the display unit 101, and an operation input is received by causing the user to touch (contact) a desired operation position.
- The determination unit 103 determines the contact strength when an object contacts the touch panel 102.
- the object is, for example, a user's hand or finger.
- the determination unit 103 determines the contact strength of the object in at least two stages (for example, whether the contact strength of the object is higher than a predetermined strength or lower than a predetermined strength).
- the display control unit 104 controls the display mode of the map displayed on the display unit 101 based on the determination result of the determination unit 103. Specifically, the display control unit 104 scrolls the map when the contact strength is equal to or lower than a predetermined strength, and changes the overhead angle when the contact strength is higher than the predetermined strength. When scrolling the map, for example, the display control unit 104 scrolls the map data in a direction from the center point of the currently displayed map data to the contact point of the object on the touch panel 102 (display unit 101).
- When changing the bird's-eye view angle, the display control unit 104 decreases the bird's-eye angle when the object contacts the area of the touch panel 102 that displays the map behind the reference point (with the ground surface tilted); that is, when a touch is made behind the reference point, the displayed map is brought closer to the driver's view. Conversely, the display control unit 104 increases the bird's-eye angle when the object contacts the area that displays the map in front of the reference point; that is, when the near side of the reference point is touched, the map is brought closer to the plan view.
- The display control unit 104 may increase the speed at which the overhead angle changes as the distance between the reference point and the point where the object touches the touch panel 102 becomes longer. In this case, a user who wants to change the bird's-eye view angle greatly can touch a point far from the reference point, while a user who wants to change it only slightly can touch a point close to the reference point, so the overhead angle can be changed efficiently.
- The display control unit 104 may continue scrolling or changing the overhead angle while the object remains in contact with the touch panel 102. In this case, when the contact strength becomes higher than the predetermined strength while scrolling is in progress, the display control unit 104 may increase the scrolling speed instead of switching to changing the overhead angle. Alternatively, the display control unit 104 may continue scrolling without reflecting the change in contact strength at all.
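The display-control rules above can be summarized in a small sketch. The threshold value and the screen-coordinate convention are assumptions for illustration, not values from the patent:

```python
STRENGTH_THRESHOLD = 0.5  # assumed normalized pressure threshold

def handle_touch(strength, touch_y, reference_y):
    """Dispatch a touch per the rules above: contact at or below the
    threshold scrolls; stronger contact tilts the map.  In typical
    screen coordinates, the area 'behind' the reference point is drawn
    above it (smaller y)."""
    if strength <= STRENGTH_THRESHOLD:
        return "scroll"
    if touch_y < reference_y:
        return "decrease_overhead_angle"  # toward the driver's view
    return "increase_overhead_angle"      # toward the plan view
```

A weak touch anywhere scrolls; a strong touch tilts the map one way or the other depending on which side of the reference point was pressed.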
- FIG. 3 is a flowchart showing the procedure of map display processing by the map display device.
- the map display device 100 first displays a map viewed from a predetermined overhead angle with respect to the ground surface on the display unit 101 (step S301).
- the map displayed at this time is a map viewed from any overhead angle from the viewpoint V 1 to the viewpoint V 3 .
- the map display apparatus 100 determines whether or not an object has touched the touch panel 102 (step S302). If the object does not contact (step S302: No), the process returns to step S301 and the display of the map is continued as it is.
- When the object is in contact (step S302: Yes), the map display device 100 determines the contact strength of the object with the determination unit 103 and determines whether the contact strength is equal to or lower than a predetermined strength (step S303).
- When the contact strength is equal to or lower than the predetermined strength (step S303: Yes), the map display device 100 scrolls the map displayed on the display unit 101 (step S304) and ends the processing according to this flowchart.
- When the contact strength is higher than the predetermined strength (step S303: No), the map display device 100 changes the bird's-eye view angle of the displayed map (step S305) and ends the processing according to this flowchart.
- As described above, the map display device 100 determines the contact strength of an object on the touch panel 102, scrolls the map when the contact strength is equal to or lower than the predetermined strength, and changes the overhead angle of the map when the contact strength is higher than the predetermined strength. As a result, it is not necessary to provide a separate interface for changing the overhead angle of the overhead view, and the display area of the display unit 101 and other operation devices can be used effectively.
- In other words, the map display device 100 can accept two types of operation input using only the strength of contact with the touch panel 102. Varying the contact strength is easier than other operations (for example, pressing two separate buttons), so even when the map display device 100 is mounted on a moving body such as a vehicle, it can be operated without hindering movement.
- the map display device 100 continues scrolling or changing the overhead angle while the object continues to touch the touch panel 102. Accordingly, it is possible to easily change the map display area and the bird's-eye view angle, or to continuously change the map display area and the bird's-eye view angle. Further, the map display device 100 increases the scrolling speed when the contact strength becomes larger than a predetermined strength while the scrolling of the map is continued. This is because when the user feels that the current scrolling speed is slow, the contact strength of the user with the touch panel 102 tends to increase. By changing the scroll speed based on the contact strength, the scroll operation can be performed more intuitively.
- The map display device 100 decreases the overhead angle when an object contacts a region behind the reference point, and increases the overhead angle when the object contacts a region in front of the reference point.
- the overhead angle changing operation can be intuitively performed, and the user can easily change the overhead angle to a desired angle.
- The map display device 100 increases the speed at which the overhead angle changes as the distance between the reference point and the point where the object touches the touch panel becomes longer. This allows the display to be changed efficiently when the bird's-eye view angle is to be changed greatly.
- FIG. 4 is a block diagram illustrating a hardware configuration of the navigation apparatus.
- The navigation apparatus 400 includes a CPU 401, a ROM 402, a RAM 403, a recording/reproducing unit 404 that records and reproduces various data, a recording unit 405 that records various data, an audio I/F (interface) 406, a microphone 407, a speaker 408, an input device 409, a video I/F 410, a display 411, a camera 412, a communication I/F 413, and a GPS unit 414.
- Each component 401 to 414 is connected by a bus 420.
- the CPU 401 is responsible for overall control of the navigation device 400.
- the ROM 402 stores programs such as a boot program, a map data display program, a route search program, and a facility search program.
- the RAM 403 is used as a work area for the CPU 401. That is, the CPU 401 controls the entire navigation device 400 by executing various programs recorded in the ROM 402 while using the RAM 403 as a work area.
- the recording / playback unit 404 controls reading / writing of data with respect to the recording unit 405 according to the control of the CPU 401.
- the recording unit 405 records data written under the control of the recording / reproducing unit 404.
- As the recording/reproducing unit 404, for example, a magnetic disk drive or an optical disk drive can be used; as the recording unit 405, for example, an HD (hard disk), FD (flexible disk), flash memory, MO, SSD (solid state disk), or memory card can be used.
- The map data includes background data representing features such as buildings, rivers, and the ground surface, and road shape data representing the shape of roads, and is composed of a plurality of data files divided by district.
- the road shape data further has traffic condition data.
- The traffic condition data includes, for example, the presence or absence of signals and pedestrian crossings, the presence or absence of expressway entrances/exits and junctions, the length (distance) of each link, the road width, the direction of travel, and the road type (expressway, toll road, general road, and the like).
- the functional data is three-dimensional data representing the shape of the facility on the map, character data representing the description of the facility, and other various data other than the map data.
- The map data and the function data are recorded in blocks divided by district or by function. Specifically, for example, the map data is recorded in a state divided into a plurality of blocks, each representing a predetermined district in the map displayed on the display screen, and the function data is recorded in a state divided into a plurality of blocks, each realizing one function.
- the function data is data for realizing functions such as program data for realizing route search, calculation of required time, route guidance, and the like.
- Each of the map data and the function data is composed of a plurality of data files divided for each district or each function.
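One way to picture the per-district block division described above is as district-keyed records of background features and road shapes. The field names below are purely illustrative, not the patent's data format:

```python
from dataclasses import dataclass, field

@dataclass
class DistrictBlock:
    """One per-district data file: background features plus road shapes."""
    district_id: str
    background: list = field(default_factory=list)   # buildings, rivers, ground surface
    road_shapes: list = field(default_factory=list)  # each carrying traffic-condition data

# Map data as a collection of blocks, each covering one displayed district.
map_data = {
    "district-001": DistrictBlock(
        "district-001",
        background=["building", "river"],
        road_shapes=[{"length_m": 120.0, "road_type": "general"}],
    ),
}
```

Dividing the data this way lets the device load only the blocks for the districts currently shown on screen.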
- map data is recorded in the recording unit 405.
- necessary map data may be received via the communication I / F 413 and used for various processes.
- the voice I / F 406 is connected to a microphone 407 for voice input and a speaker 408 for voice output.
- the audio I / F 406 D / A converts audio data instructed to be reproduced, and outputs the audio data from the speaker 408 as audio.
- the speaker 408 may be detachable from the navigation device 400 or may be located away from the main body of the navigation device 400.
- the microphone 407 is installed, for example, in the vicinity of the sun visor of the vehicle, collects the user's utterance, and outputs it to the audio I / F 406.
- the sound collected by the microphone 407 is A / D converted in the sound I / F 406.
- The input device 409 includes a remote controller provided with a plurality of keys for inputting characters, numerical values, and various instructions, a keyboard, and a touch panel.
- the navigation apparatus 400 includes at least a touch panel using the display 411 as the input device 409. As will be described later, the touch panel of the navigation device 400 can detect that an object such as a user's finger has touched the surface of the display 411 and can also detect that an object has approached the vicinity of the surface of the display 411. .
- The video I/F 410 is connected to the display 411 and the camera 412. Specifically, the video I/F 410 includes, for example, a graphic controller that controls the display 411, a buffer memory such as a VRAM (Video RAM) that temporarily records image information that can be displayed immediately, and a control IC that controls the display 411 based on image data output from the graphic controller.
- the display 411 displays map data, icons, cursors, menus, windows, or various data such as characters and images.
- a sensor is provided on the surface of the display 411 and is used as an input device 409 (touch panel) by detecting the approach and contact of an object.
- the display unit 101 and the touch panel 102 according to the embodiment will be described as a touch panel using the display 411.
- the camera 412 captures images inside or outside the vehicle on which the navigation device 400 is mounted.
- the image captured by the camera 412 may be either a still image or a moving image.
- the video imaged by the camera 412 is recorded in the recording unit 405 or the like via the video I / F 410.
- the communication I / F 413 is connected to a network via wireless, and can transmit and receive data via the network.
- The navigation apparatus 400 can also acquire map data (including map data updates) via the network. That is, even if map data is not recorded in the recording unit 405, the map data necessary for display on the display 411 can be acquired from an external server.
- Examples of the communication network include a LAN, a WAN, a public line network, and a mobile phone network.
- the GPS unit 414 receives radio waves from GPS satellites and outputs information indicating the current location of the vehicle on which the navigation device 400 is mounted.
- the GPS unit 414 includes various sensors such as a speed sensor, an acceleration sensor, and an angular velocity sensor.
- the output information of the GPS unit 414 is used when the current position of the navigation device 400 is calculated by the CPU 401.
- the information indicating the current location is information for specifying one point on the map data, such as latitude / longitude and altitude.
- The functions of the components of the map display device 100 shown in FIG. 1 are realized by the CPU 401 executing predetermined programs and controlling each unit in FIG. 4, using the programs and data recorded in the ROM 402, the RAM 403, the recording unit 405, and the like.
- In map display processing, the navigation device 400 displays map data around the current position of the vehicle on the display 411 when performing route guidance.
- the navigation device 400 can also read map data around an arbitrary point designated by the user from the recording unit 405 and display it on the display 411.
- There are various display modes for map data in the navigation device 400; one example is a map in which the ground surface is viewed from a predetermined overhead angle.
- The display mode can be changed from a plan view looking straight down at the ground surface (a 90° bird's-eye view) to a driver's view looking at another point on the ground surface from a viewpoint close to it (an angle close to an overhead angle of 0°).
- The interface for accepting a change in the overhead angle from the user can take various forms. However, various information such as route guidance information, the distance to the destination, and a map scale change button is displayed on the display 411 of the navigation device 400 while traveling. For this reason, if an interface for accepting changes to the overhead angle were always displayed on the display 411, the map data display area would become narrower, and the user might not be able to obtain required information immediately.
- Therefore, the navigation device 400 detects the contact strength of the user's hand on the display 411 using a pressure-sensitive detection type touch panel that can detect the contact strength of an object. When there is contact with the display 411 displaying the map, the navigation device 400 processes it as an operation input for scrolling the displayed map if the contact strength is equal to or lower than a predetermined strength, and as an operation input for changing the overhead angle of the map if the contact strength is higher than the predetermined strength. As a result, an operation input for changing the overhead angle can easily be accepted without constantly displaying an interface for that purpose. Contact at or below the predetermined strength is assigned to scrolling because treating a touch on a displayed map as a scroll operation is the common convention.
- FIG. 5 and 6 are explanatory diagrams showing an outline of a pressure-sensitive detection type touch panel.
- FIG. 5 is a view of the pressure-sensitive detection type touch panel 500 viewed from the side, and FIG. 6 is a view of the touch panel 500 viewed from the front.
- the touch panel 500 includes a touch sensor 501 that detects the contact strength of an object on the surface of the display 411.
- the touch sensor 501 can detect the contact strength of an object in at least two stages (whether it is less than a predetermined strength or greater than a predetermined strength).
- the touch sensor 501 detects the position on the touch panel 500 where the user's finger F touches.
- the touch sensor 501 takes the X coordinate in the horizontal direction and the Y coordinate in the vertical direction of the touch panel 500, for example, and the position P touched by the user's finger F is set to the coordinates (X, Y).
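The touch sensor's output can be modeled as the coordinate pair P = (X, Y) plus a two-stage strength reading. The names and the threshold value are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class TouchReading:
    """Position P = (x, y) on the touch panel plus the sensed pressure."""
    x: int
    y: int
    pressure: float

    def is_strong(self, threshold=0.5):
        """Two-stage discrimination: higher than, or at/below, the threshold."""
        return self.pressure > threshold
```

The rest of the map display processing only needs this two-stage result, not the raw pressure value.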
- FIG. 7 is a flowchart showing the procedure of map display processing by the navigation device.
- the navigation apparatus 400 displays a map with a predetermined point as a reference point on the display 411 (step S701).
- the predetermined point is, for example, a current position of the vehicle or a point designated by the user.
- the reference point may be the center of the map to be displayed, or may be another position (for example, slightly below the center of the display 411).
- the overhead angle of the overhead view displayed in step S701 is set to an angle specified by default, an angle specified by the user, an angle specified when the previous map was displayed, or the like.
- the navigation apparatus 400 determines whether or not an object (such as a user's hand) has touched the display 411 (step S702). Until there is an object contact (step S702: No), the process returns to step S701 and the map display is continued. On the other hand, when there is an object contact (step S702: Yes), it is determined whether or not the contact strength is equal to or lower than a predetermined strength (step S703).
- the contact strength of the object is determined using the detection result by the touch sensor described with reference to FIGS.
- When the contact strength is equal to or lower than the predetermined strength (step S703: Yes), the navigation device 400 scrolls the map in the contacted direction (step S704).
- the contacted direction is, for example, a direction from the reference point of the map data toward the contacted point on the display 411.
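That scroll direction can be sketched as a unit vector from the map's reference point toward the contact point; this is a minimal illustration, not the patent's implementation:

```python
import math

def scroll_direction(reference, contact):
    """Unit vector pointing from the map's reference point toward the
    point touched on the display."""
    dx = contact[0] - reference[0]
    dy = contact[1] - reference[1]
    norm = math.hypot(dx, dy)
    if norm == 0.0:
        return (0.0, 0.0)  # touch exactly on the reference point: no scroll
    return (dx / norm, dy / norm)
```

Scrolling then advances the displayed map area along this vector while the weak press continues.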
- When the contact strength is higher than the predetermined strength (step S703: No), the navigation device 400 changes the overhead angle of the displayed overhead view (step S705).
- When the position contacted by the object is an area displaying the map behind the reference point, the overhead angle is decreased; when it is an area displaying the map in front of the reference point, the overhead angle is increased.
- When the bird's-eye view angle decreases, the bird's-eye view displayed on the display 411 approaches a view of the reference point seen from another point (viewpoint position) on the ground surface (driver's view).
- When the bird's-eye view angle increases, the bird's-eye view displayed on the display 411 approaches a view of the reference point seen from directly above (plan view).
- Until the display on the display 411 is changed to something other than the overhead view (step S706: No), the navigation device 400 returns to step S701 and repeats the subsequent processing. When the display is changed to something other than the overhead view (step S706: Yes), the processing according to this flowchart ends.
- FIG. 8 shows an overhead view 800 in which the reference point D on the ground surface is viewed at a predetermined overhead angle.
- the reference point D is the current position of the vehicle, and an icon indicating the current position of the vehicle is displayed.
- When the user strongly presses (with a strength greater than the predetermined strength) an area of the display 411 behind the reference point D, the overhead angle is reduced, and the overhead view displayed on the display 411 becomes like the overhead view 900 of FIG. 9.
- FIG. 10 is a view in which the viewpoint is farther from the ground surface than in the overhead view 800 of FIG. 8.
- the current position of the vehicle is the reference point D, but the position of the reference point D is not limited to this.
- Alternatively, the center of the display screen of the display 411 may be set as the reference point D, with an icon (cross icon) displayed at that position as shown in the figures.
- the navigation device 400 may provide the reference point D at the center of the region indicating the ground surface when displaying an overhead view viewed obliquely from above. In this case, every time the overhead angle is changed, the ratio of the area indicating the ground surface in the entire screen changes (the area indicating the sky increases and the area indicating the ground surface decreases as the overhead angle decreases). Accordingly, the position of the reference point D also changes.
- the navigation device 400 may not change the overhead angle even if the area indicating the sky is strongly pressed. By doing so, the user can more clearly feel that the surface is tilted when the user presses the ground surface, and can intuitively perform an operation of changing the overhead angle.
- the bird's-eye view angle has a minimum value and a maximum value, and even when the bird's-eye view angle is the minimum value (for example, an angle close to 0 °), even if the area A1 on the back side from the reference point D is pressed strongly, the bird's-eye view angle further increases Since the angle cannot be reduced, the operation is invalidated. Similarly, even if the area A2 closer to the front side than the reference point D is strongly pressed when the overhead angle is the maximum value (for example, 90 °), the overhead angle cannot be increased any more, so the operation is invalidated.
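The limit behaviour described above, where presses that would push the overhead angle past its minimum or maximum are invalidated, might look like this. The limit values are assumptions for illustration:

```python
MIN_ANGLE = 10.0   # assumed minimum (an angle close to 0 degrees)
MAX_ANGLE = 90.0   # the plan view

def apply_angle_change(current, delta):
    """Return the new overhead angle, or the unchanged angle when the
    request would exceed a limit (the operation is invalidated)."""
    candidate = current + delta
    if candidate < MIN_ANGLE or candidate > MAX_ANGLE:
        return current
    return candidate
```

At either limit, further presses in the same direction leave the display unchanged, which signals to the user that the limit has been reached.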
- When the user weakly presses (with a strength equal to or lower than the predetermined strength) an area of the display 411 behind the reference point D, the display scrolls to a map of the area behind the reference point D, as shown in the overhead view 1100 of FIG. 11.
- the reference point D is no longer the current position of the vehicle, and an icon indicating the reference point is displayed.
- Similarly, when the user weakly presses an area in front of the reference point D, the display scrolls to a map of the area in front of the reference point D, as shown in the overhead view 1200 of FIG. 12.
- the reference point D is not the current position of the vehicle, and an icon indicating the reference point is displayed at a position away from the icon indicating the current position of the vehicle.
- while the contact continues, the navigation device 400 keeps performing the process selected by the contact strength (changing the overhead angle or scrolling).
- for example, the display 411 is updated so that the overhead angle decreases gradually until the pressing stops or the overhead angle reaches its minimum value.
- likewise, the map keeps scrolling in the direction approaching the reference point D until the pressing stops.
- even if the contact strength changes, the navigation device 400 continues the process that has already started. For example, if the contact strength rises above the predetermined strength while the display 411 has been pressed weakly for a long time (that is, while the map is scrolling), the navigation device 400 continues to scroll the map. When the contact becomes stronger during such a scroll, the navigation device 400 may instead increase the scroll speed. When a user feels that the current scroll speed is too slow, the contact strength on the display 411 tends to increase, so varying the scroll speed with the contact strength makes the scroll operation more intuitive.
- the speed at which the overhead angle changes, or the scroll speed, may also be varied: the farther the touched point on the display 411 is from the reference point of the map, the faster the overhead angle changes or the map scrolls. This changes the display efficiently when the user wants to change the overhead angle substantially or to view a point far from the currently displayed area.
- as described above, the navigation device 400 determines the strength with which an object contacts the display 411, scrolls the map when the contact strength is at or below a predetermined strength, and changes the overhead angle of the map when the contact strength exceeds it. No separate interface for changing the overhead angle is needed, so the display area of the display 411 and the other operation devices can be used effectively.
- in other words, the navigation device 400 accepts two types of operation input through the strength of contact with the display 411 alone. Varying the contact strength on the display 411 is easier than alternative operations (for example, pressing two different buttons), so even in a device such as the navigation device 400, which is mounted on a moving body such as a vehicle, the operation can be performed without hindering travel.
- the navigation device 400 continues scrolling or changing the overhead angle while the object remains in contact with the display 411, so the map display area and the overhead angle can be changed both easily and continuously.
- the navigation device 400 increases the scroll speed when the contact strength rises above the predetermined strength while the map is scrolling. When a user feels that the current scroll speed is too slow, the contact strength on the display 411 tends to increase, so varying the scroll speed with the contact strength makes the scroll operation more intuitive.
- when changing the overhead angle, the navigation device 400 decreases the angle when the object contacts a region beyond the reference point, and increases it when the object contacts a region in front of the reference point.
- this makes the angle-changing operation intuitive, and the user can easily set the overhead angle to a desired value.
- the navigation device 400 also changes the overhead angle faster as the distance between the reference point and the point where the object touches the touch panel increases, so the display changes efficiently when a large change in the overhead angle is wanted.
- the map display method described in this embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
- the program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer.
- the program may also be a transmission medium that can be distributed via a network such as the Internet.
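Taken together, the behavior described above (scroll on a weak press, tilt on a strong press, with the overhead angle clamped between its minimum and maximum) can be sketched as follows. This is an illustrative sketch only, not part of the patent; the class name, the threshold, and the angle units are all assumptions.

```python
# Illustrative sketch of the pressure-threshold dispatch described above.
# All names, units, and threshold values are assumptions, not from the patent.

PRESSURE_THRESHOLD = 0.5  # boundary between a "weak" and a "strong" press
MIN_ANGLE = 0.0           # overhead angle close to 0 degrees (near-horizontal view)
MAX_ANGLE = 90.0          # looking straight down at the ground surface


class MapView:
    def __init__(self):
        self.overhead_angle = 45.0  # degrees
        self.scroll_offset = 0.0    # positive = toward the area beyond point D

    def on_touch(self, pressure, behind_reference):
        """Handle one touch sample.

        pressure: measured contact strength of the object on the panel.
        behind_reference: True if the touch lies beyond the reference
        point D (area A1), False if it lies on the near side (area A2).
        """
        if pressure <= PRESSURE_THRESHOLD:
            # Weak press: scroll the map toward the touched side.
            self.scroll_offset += 1.0 if behind_reference else -1.0
        else:
            # Strong press: tilt the ground plane instead of scrolling.
            delta = -1.0 if behind_reference else 1.0
            new_angle = self.overhead_angle + delta
            # At the minimum or maximum the operation is simply invalidated.
            self.overhead_angle = max(MIN_ANGLE, min(MAX_ANGLE, new_angle))
```

A weak press beyond D scrolls toward the far side, while a strong press at the same spot lowers the overhead angle instead; presses that would push the angle past either limit leave the view unchanged.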
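The "continue while pressed" behavior, where a process that has started keeps running even if the contact strength later crosses the threshold and a stronger press only speeds the scroll up, could look like the sketch below. The function name, speed constants, and per-frame model are assumptions for illustration.

```python
# Illustrative sketch: once a scroll has started it continues while the
# press continues, and a press above the threshold only raises the speed.
# Names and constants are assumptions, not from the patent.

PRESSURE_THRESHOLD = 0.5
BASE_SPEED = 1.0  # map units per frame for a weak press
FAST_SPEED = 3.0  # map units per frame once the press becomes strong


def scroll_step(scroll_offset, pressure, still_pressed):
    """Advance one frame of a scroll that is already in progress."""
    if not still_pressed:
        return scroll_offset  # press released: the scroll stops here
    # The mode is not re-evaluated mid-gesture: a strong press during a
    # scroll does not switch to angle changing, it only scrolls faster.
    speed = FAST_SPEED if pressure > PRESSURE_THRESHOLD else BASE_SPEED
    return scroll_offset + speed
```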
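The distance-dependent speed described above (the farther the touch point is from the reference point, the faster the angle change or scroll) reduces to a simple proportional rule. The linear gain below is an assumption; the patent states only that a larger distance gives a faster change.

```python
# Illustrative sketch: speed grows with the distance between the touch
# point and the reference point. The linear gain is an assumption.

SPEED_GAIN = 0.1  # speed units per pixel of distance (assumed)


def change_speed(touch_x, touch_y, ref_x, ref_y):
    """Return a speed proportional to the touch-to-reference distance."""
    distance = ((touch_x - ref_x) ** 2 + (touch_y - ref_y) ** 2) ** 0.5
    return SPEED_GAIN * distance
```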
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Theoretical Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Software Systems (AREA)
- Geometry (AREA)
- Mathematical Physics (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Navigation (AREA)
- Instructional Devices (AREA)
Abstract
Description
FIG. 1 is a block diagram showing the functional configuration of the map display device according to the embodiment. The map display device 100 according to the embodiment comprises a display unit 101, a touch panel 102, a determination unit 103, and a display control unit 104.
FIG. 4 is a block diagram showing the hardware configuration of the navigation device. The navigation device 400 according to the example comprises a CPU 401, a ROM 402, a RAM 403, a recording/reproducing unit 404 that records and reproduces various data, a recording unit 405 that records various data, an audio I/F (interface) 406, a microphone 407, a speaker 408, an input device 409, a video I/F 410, a display 411, a camera 412, a communication I/F 413, and a GPS unit 414. The components 401 to 414 are connected to one another by a bus 420.
Next, map display processing by the navigation device 400 will be described. When performing route guidance, the navigation device 400 displays map data around the current position of the vehicle on the display 411. The navigation device 400 can also read map data around an arbitrary point specified by the user from the recording unit 405 and display it on the display 411.
101 Display unit
102 Touch panel
103 Determination unit
104 Display control unit
Claims (8)
- A map display device comprising: display means capable of displaying a map viewed from a predetermined overhead angle with respect to the ground surface;
a touch panel provided over the display means;
determination means for determining the contact strength when an object contacts the touch panel; and
display control means for controlling, based on the determination result of the determination means, the display mode of the map displayed on the display means,
wherein the display control means scrolls the map when the contact strength is at or below a predetermined strength, and changes the overhead angle when the contact strength is greater than the predetermined strength. - The map display device according to claim 1, wherein the display control means continues the scrolling or the changing of the overhead angle while the contact of the object with the touch panel continues.
- The map display device according to claim 2, wherein the display control means increases the scrolling speed when the contact strength becomes greater than the predetermined strength while the scrolling of the map continues.
- In the map display device according to any one of claims 1 to 3, the display means, when displaying an overhead view of the map in which the ground surface is viewed obliquely from above, displays the map with the ground surface tilted about a reference point on the display means,
and the display control means, when changing the overhead angle, decreases the overhead angle when the object contacts the region of the touch panel that displays the map beyond the reference point when the ground surface is tilted, and increases the overhead angle when the object contacts the region of the touch panel that displays the map on the near side of the reference point when the ground surface is tilted. - The map display device according to claim 4, wherein the display control means, when changing the overhead angle, increases the speed at which the overhead angle is changed as the distance between the reference point and the point at which the object contacts the touch panel increases.
- A map display method for a map display device comprising display means and a touch panel provided over the display means, the method comprising:
a display step of displaying, on the display means, a map viewed from a predetermined overhead angle with respect to the ground surface;
a determination step of determining the contact strength when an object contacts the touch panel; and
a display control step of controlling, based on the determination result of the determination step, the display mode of the map displayed on the display means,
wherein, in the display control step, the map is scrolled when the contact strength is at or below a predetermined strength, and the overhead angle is changed when the contact strength is greater than the predetermined strength. - A map display program causing a computer to execute the map display method according to claim 6.
- A computer-readable recording medium on which the map display program according to claim 7 is recorded.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010534307A JP4943543B2 (ja) | 2009-11-30 | 2009-11-30 | Map display device, map display method, map display program, and recording medium |
PCT/JP2009/070139 WO2011064895A1 (ja) | 2009-11-30 | 2009-11-30 | Map display device, map display method, map display program, and recording medium |
US13/063,879 US8922592B2 (en) | 2009-11-30 | 2009-11-30 | Map display device, map display method, map display program, and computer-readable recording medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2009/070139 WO2011064895A1 (ja) | 2009-11-30 | 2009-11-30 | Map display device, map display method, map display program, and recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011064895A1 true WO2011064895A1 (ja) | 2011-06-03 |
Family
ID=44066014
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/070139 WO2011064895A1 (ja) | 2009-11-30 | 2009-11-30 | 地図表示装置、地図表示方法、地図表示プログラムおよび記録媒体 |
Country Status (3)
Country | Link |
---|---|
US (1) | US8922592B2 (ja) |
JP (1) | JP4943543B2 (ja) |
WO (1) | WO2011064895A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013037521A (ja) * | 2011-08-08 | 2013-02-21 | Toshiba Alpine Automotive Technology Corp | Input device |
WO2013046596A1 (ja) * | 2011-09-26 | 2013-04-04 | NEC Casio Mobile Communications, Ltd. | Portable information processing terminal |
WO2013046426A1 (ja) * | 2011-09-30 | 2013-04-04 | Pioneer Corporation | Head-up display, image display method, image display program, and display device |
JP2013206354A (ja) * | 2012-03-29 | 2013-10-07 | Aisin Aw Co Ltd | Image display device, image display method, and computer program |
JP2014228702A (ja) * | 2013-05-22 | 2014-12-08 | Toyota Motor Corporation | Map display control device |
WO2015053040A1 (ja) * | 2013-10-11 | 2015-04-16 | Fujitsu Ten Limited | Image display device, image display system, image display method, and program |
JP2019032886A (ja) * | 2018-10-24 | 2019-02-28 | Pioneer Corporation | Display control device, display control method, and program for display control device |
JP2019079152A (ja) * | 2017-10-20 | 2019-05-23 | Yahoo Japan Corporation | Information control program, information control method, and terminal device |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10191641B2 (en) | 2011-12-29 | 2019-01-29 | Apple Inc. | Device, method, and graphical user interface for navigation of information in a map-based interface |
US9418672B2 (en) | 2012-06-05 | 2016-08-16 | Apple Inc. | Navigation application with adaptive instruction text |
US8880336B2 (en) | 2012-06-05 | 2014-11-04 | Apple Inc. | 3D navigation |
US9367959B2 (en) * | 2012-06-05 | 2016-06-14 | Apple Inc. | Mapping application with 3D presentation |
US8965696B2 (en) | 2012-06-05 | 2015-02-24 | Apple Inc. | Providing navigation instructions while operating navigation application in background |
US9111380B2 (en) | 2012-06-05 | 2015-08-18 | Apple Inc. | Rendering maps |
US9135751B2 (en) | 2012-06-05 | 2015-09-15 | Apple Inc. | Displaying location preview |
US9311750B2 (en) | 2012-06-05 | 2016-04-12 | Apple Inc. | Rotation operations in a mapping application |
US9159153B2 (en) | 2012-06-05 | 2015-10-13 | Apple Inc. | Method, system and apparatus for providing visual feedback of a map view change |
US9541417B2 (en) | 2012-06-05 | 2017-01-10 | Apple Inc. | Panning for three-dimensional maps |
US10156455B2 (en) | 2012-06-05 | 2018-12-18 | Apple Inc. | Context-aware voice guidance |
US9997069B2 (en) | 2012-06-05 | 2018-06-12 | Apple Inc. | Context-aware voice guidance |
US9482296B2 (en) | 2012-06-05 | 2016-11-01 | Apple Inc. | Rendering road signs during navigation |
US9269178B2 (en) | 2012-06-05 | 2016-02-23 | Apple Inc. | Virtual camera for 3D maps |
US9886794B2 (en) | 2012-06-05 | 2018-02-06 | Apple Inc. | Problem reporting in maps |
US10176633B2 (en) | 2012-06-05 | 2019-01-08 | Apple Inc. | Integrated mapping and navigation application |
EP2926317B1 (en) * | 2012-12-03 | 2020-02-12 | Harman International Industries, Incorporated | System and method for detecting pedestrians using a single normal camera |
US9418485B2 (en) | 2013-05-31 | 2016-08-16 | Apple Inc. | Adjusting heights for road path indicators |
US9678651B2 (en) | 2013-06-08 | 2017-06-13 | Apple Inc. | Mapping application with interactive compass |
US9903735B2 (en) | 2015-03-30 | 2018-02-27 | International Business Machines Corporation | Route stabilization scrolling mode |
US10740972B2 (en) * | 2017-04-28 | 2020-08-11 | Harman International Industries, Incorporated | System and method for presentation and control of augmented vehicle surround views |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06282378A (ja) * | 1993-03-30 | 1994-10-07 | Toshiba Corp | Display monitoring device |
JP2006330428A (ja) * | 2005-05-27 | 2006-12-07 | Xanavi Informatics Corp | Map display device |
JP2007034101A (ja) * | 2005-07-29 | 2007-02-08 | Matsushita Electric Ind Co Ltd | Data processing device |
JP2007094708A (ja) * | 2005-09-28 | 2007-04-12 | Kddi Corp | Information terminal device |
JP2007192881A (ja) * | 2006-01-17 | 2007-08-02 | Xanavi Informatics Corp | In-vehicle map display device |
JP2007328570A (ja) * | 2006-06-08 | 2007-12-20 | Xanavi Informatics Corp | Map display device |
JP2009140368A (ja) * | 2007-12-07 | 2009-06-25 | Sony Corp | Input device, display device, input method, display method, and program |
JP2009157908A (ja) * | 2007-12-07 | 2009-07-16 | Sony Corp | Information display terminal, information display method, and program |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4790028A (en) * | 1986-09-12 | 1988-12-06 | Westinghouse Electric Corp. | Method and apparatus for generating variably scaled displays |
US6654014B2 (en) * | 1995-04-20 | 2003-11-25 | Yoshinori Endo | Bird's-eye view forming method, map display apparatus and navigation system |
US8479122B2 (en) * | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US9513765B2 (en) | 2007-12-07 | 2016-12-06 | Sony Corporation | Three-dimensional sliding object arrangement method and system |
JP4605214B2 (ja) * | 2007-12-19 | 2011-01-05 | Sony Corporation | Information processing device, information processing method, and program |
US20100309228A1 (en) * | 2009-06-04 | 2010-12-09 | Camilo Mattos | Displaying Multi-Dimensional Data Using a Rotatable Object |
US8363020B2 (en) * | 2009-08-27 | 2013-01-29 | Symbol Technologies, Inc. | Methods and apparatus for pressure-based manipulation of content on a touch screen |
-
2009
- 2009-11-30 US US13/063,879 patent/US8922592B2/en active Active
- 2009-11-30 JP JP2010534307A patent/JP4943543B2/ja not_active Expired - Fee Related
- 2009-11-30 WO PCT/JP2009/070139 patent/WO2011064895A1/ja active Application Filing
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013037521A (ja) * | 2011-08-08 | 2013-02-21 | Toshiba Alpine Automotive Technology Corp | Input device |
WO2013046596A1 (ja) * | 2011-09-26 | 2013-04-04 | NEC Casio Mobile Communications, Ltd. | Portable information processing terminal |
WO2013046426A1 (ja) * | 2011-09-30 | 2013-04-04 | Pioneer Corporation | Head-up display, image display method, image display program, and display device |
JP2013206354A (ja) * | 2012-03-29 | 2013-10-07 | Aisin Aw Co Ltd | Image display device, image display method, and computer program |
JP2014228702A (ja) * | 2013-05-22 | 2014-12-08 | Toyota Motor Corporation | Map display control device |
WO2015053040A1 (ja) * | 2013-10-11 | 2015-04-16 | Fujitsu Ten Limited | Image display device, image display system, image display method, and program |
JP2015076062A (ja) * | 2013-10-11 | 2015-04-20 | Fujitsu Ten Limited | Image display device, image display system, image display method, and program |
US10857974B2 (en) | 2013-10-11 | 2020-12-08 | Fujitsu Ten Limited | Image display device, image display system, image display method and program |
US11643047B2 (en) | 2013-10-11 | 2023-05-09 | Fujitsu Ten Limited | Image display device, image display system, image display method and program |
JP2019079152A (ja) * | 2017-10-20 | 2019-05-23 | Yahoo Japan Corporation | Information control program, information control method, and terminal device |
JP2019032886A (ja) * | 2018-10-24 | 2019-02-28 | Pioneer Corporation | Display control device, display control method, and program for display control device |
Also Published As
Publication number | Publication date |
---|---|
JPWO2011064895A1 (ja) | 2013-04-11 |
US20110249030A1 (en) | 2011-10-13 |
US8922592B2 (en) | 2014-12-30 |
JP4943543B2 (ja) | 2012-05-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4943543B2 (ja) | Map display device, map display method, map display program, and recording medium | |
CN110618800B (zh) | Interface display method, apparatus, device, and storage medium | |
JP4111885B2 (ja) | Map search and display method and device | |
JP5174704B2 (ja) | Image processing device and image processing method | |
EP2360557A1 (en) | Input device, apparatus for monitoring area around vehicle, method for selecting icon switch, and program | |
JP6429886B2 (ja) | Tactile control system and tactile control method | |
JP5754410B2 (ja) | Display device | |
JP4067374B2 (ja) | Image processing device | |
JP5358215B2 (ja) | Map display device | |
KR20150005386A (ko) | Mobile terminal having a touch screen supporting multi-touch input, and control method therefor | |
JP5453069B2 (ja) | Map display device, map display method, map display program, and recording medium | |
JP2019096026A (ja) | Operation reception system and operation reception program | |
JPWO2009028085A1 (ja) | Map display device, map display method, and map display program | |
JP2011080851A (ja) | Navigation device and map image display method | |
JP5389624B2 (ja) | Information processing device, input control method, input control program, and recording medium | |
JP3337144B2 (ja) | Navigation device | |
JP5933468B2 (ja) | Information display control device, information display device, and information display control method | |
JP5665838B2 (ja) | Image processing device, image display method, and program | |
JP5453070B2 (ja) | Map display device, map display method, map display program, and recording medium | |
JP2005128791A (ja) | Display device | |
WO2015151154A1 (ja) | Display device, display method, and display program | |
WO2007123104A1 (ja) | Route guidance device, route guidance method, route guidance program, and recording medium | |
WO2007105500A1 (ja) | Navigation device, navigation method, navigation program, and computer-readable recording medium | |
JP2019036325A (ja) | Display device, display method, and display program | |
JP2011117742A (ja) | Information processing device, input method, input program, and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2010534307 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13063879 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09851684 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: COMMUNICATION NOT DELIVERED. NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112 EPC (EPO FORM 1205A DATED 12.09.2012) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09851684 Country of ref document: EP Kind code of ref document: A1 |