JP2009098086A - Navigation device and scroll method - Google Patents

Navigation device and scroll method

Info

Publication number
JP2009098086A
Authority
JP
Japan
Prior art keywords
scale
scroll
navigation device
map
operation
Prior art date
Legal status
Pending
Application number
JP2007272058A
Other languages
Japanese (ja)
Inventor
Yukiya Denda
志哉 傳田
Original Assignee
Xanavi Informatics Corp
株式会社ザナヴィ・インフォマティクス
Priority date
Filing date
Publication date
Application filed by Xanavi Informatics Corp (株式会社ザナヴィ・インフォマティクス)
Priority to JP2007272058A
Publication of JP2009098086A
Application status: Pending

Abstract

PROBLEM TO BE SOLVED: To perform scrolling and scale change of a map display simultaneously with a simple operation.
SOLUTION: In a navigation device, an enlargement scroll button 310 and a reduction scroll button 312 are displayed on a display 12 together with a map 300 and a current position mark 302. When the user touches the area where the enlargement scroll button 310 or the reduction scroll button 312 is displayed, the screen is scrolled in the direction corresponding to the position of the button. At the same time as (in parallel with) the scrolling, the scale is changed in the direction (enlargement or reduction) corresponding to the button.
COPYRIGHT: (C)2009,JPO&INPIT

Description

  The present invention relates to a navigation device, and more particularly to a navigation device capable of scrolling a map and changing its scale in response to user operations.

  Conventionally, an in-vehicle device such as a car navigation system displays a map or an operation menu on a display and accepts user operations through, for example, a touch panel (touch screen) attached to the display or hard switches on an operation panel. For example, in Patent Document 1, the displayed map is scrolled at a fixed scale by operating a scroll switch.

JP-A-9-62185

  A user typically performs scrolling and scale-change operations in order to display a map of a place that is not currently shown in the display area, such as the destination or a road in the traveling direction. The user may also reduce the scale of the map (wide-area display) to grasp the general route from the current location to the destination, the road conditions, and so on, or enlarge the scale (detailed display) to obtain detailed information about a point on the map. For this reason, the user may perform scrolling, a scale change, or both, depending on the information he or she wants to obtain.

  Conventionally, however, the scroll operation and the scale-change operation are independent of each other. Therefore, when the user wants to change the scale of the map at the place to be displayed by scrolling, the user must either change the scale in advance and then scroll, or scroll first and then change the scale, which is troublesome. Such cumbersome operations place a burden on the user and reduce driving safety.

  The present invention has been made to solve the above-described problems, and an object thereof is to provide a technique for simultaneously performing scrolling and scale change with a simple operation.

  In order to solve the above-described problems, one aspect of the present invention is a navigation device that guides a user by displaying a map on a display, comprising detection means for detecting a user operation, and display processing means for scrolling the map in accordance with the content of the operation detected by the detection means and for displaying the map while changing its scale in accordance with the length of the scroll time.

  In the above navigation device, the display processing means may be configured to make the scale of the map smaller as the scroll time becomes longer.

  Further, in the above navigation device, the detection means may display a screen for accepting a user operation and detect a user operation that includes a scroll direction and a scale-change direction, and the display processing means may scroll the map in the scroll direction detected by the detection means and display the map while changing the scale in the scale-change direction detected by the detection means.

  In addition, the detection means may be configured to provide, on the screen for accepting a user operation, a first area for accepting a scale reduction together with a scroll direction and a second area for accepting a scale enlargement together with a scroll direction, and to detect, from the user's touch operation on either area, the scale-change direction assigned to the area to which the touch position belongs and the scroll direction corresponding to the direction from a predetermined center position of the screen toward the touch position.

  In addition, the detection means may be configured to display a screen for accepting a user operation on which a button that simultaneously accepts a scale-change direction and a scroll direction is arranged at a position corresponding to that scroll direction with respect to a predetermined center position of the screen, and to detect, from the user's touch operation on the button, the scale-change direction assigned to the button and the scroll direction corresponding to the direction from the predetermined center position of the screen toward the position of the button.

  The display processing means may be configured to change the scale, up to a preset upper limit value, according to the length of time of the touch operation detected by the detection means.

  The display processing means may be configured to update the set upper limit value when a predetermined operation is detected while the touch operation continues to be detected by the detection means.

  In addition, the navigation device may further comprise scale storage means for storing, for each scale, the number of times that scale has been set, and the display processing means may be configured such that, when it changes the scale and starts scrolling in accordance with the operation detected by the detection means, it sets a scale acquired from the scale storage means in advance as the upper limit value.

  Further, the display processing means may be configured to acquire, from the scale storage means, the scale that has been set the largest number of times.

  Another aspect of the present invention is a scroll method in a navigation device that guides a user by displaying a map on a display, wherein the navigation device executes a detection step of detecting a user operation and a display step of scrolling the map in accordance with the content of the operation detected in the detection step while changing the scale in accordance with the length of the scroll time.

  A first embodiment of the present invention will be described below with reference to the drawings.

  FIG. 1 is a block diagram showing a hardware configuration and a functional configuration of a navigation apparatus to which the first embodiment of the present invention is applied. First, the hardware configuration of the navigation device 1 will be described.

  The navigation device 1 is a device that is mounted on a vehicle or the like and performs navigation processing that guides the user by displaying map information, route information, and traffic information such as traffic jam information. Of course, the navigation device 1 need not be installed in a vehicle; it may be, for example, a portable PND (Personal Navigation Device). The navigation device 1 includes a control device 10, a display unit 11, an input device 14, a voice input/output device 16, a storage device 18, various sensors 20, a GPS (Global Positioning System) receiving device 22, an FM multiplex broadcast/beacon receiving device 24, and a communication device 26. Of course, the configuration of the navigation device 1 is not limited to the above; it may also include, for example, a tuner for digital terrestrial broadcasting.

  The control device 10 is a central unit that controls each of the above devices and performs various processes. The control device 10 includes a CPU (Central Processing Unit), memory such as RAM (Random Access Memory) and ROM (Read Only Memory), and an interface (I/F) for controlling communication with the other devices. For example, the control device 10 calculates the current location using information output from the various sensors 20 and the GPS receiving device 22, reads the map information necessary for display from the storage device 18 based on the obtained current location, renders the read map information as graphics, and displays it on the display 12 with a mark indicating the current location superimposed on it. It also uses the map information to calculate an optimum route (recommended route) connecting the starting point (current location) specified by the user and the destination, and displays the calculated route on the display 12. The voice input/output device 16 is used to output voice guidance to the user. User requests are accepted via the touch input detection device 13 or the input device 14, and processing corresponding to each request is executed. In addition, traffic information and the like are received from an information center via the FM multiplex broadcast/beacon receiving device 24 and the communication device 26 and displayed on the display 12.

  The display unit 11 is a unit that displays graphics information generated by the control device 10, and includes a display 12 and a touch input detection device 13. The display 12 is composed of, for example, a CRT display or a liquid crystal display. The touch input detection device 13 is a so-called touch panel mounted on the display surface of the display 12 that lets the display screen show through. It identifies the touch position in the XY coordinate system of the image displayed on the display 12, converts the touch position into coordinates, and outputs them. The touch input detection device 13 is composed of, for example, pressure-sensitive or capacitive input detection elements.

  The input device 14 is a device for receiving an instruction from a user. The input device 14 is configured by, for example, a hard switch such as a joystick or a keyboard.

  The voice input/output device 16 converts messages to the user generated by the control device 10 into voice signals and outputs them. It also recognizes speech uttered by the user and transfers the recognized content to the control device 10.

  The storage device 18 stores programs and data (not shown) necessary for the control device 10 to execute various processes, map information 1810 used for navigation processing, a voice dictionary (not shown) used for voice recognition, and so on. These pieces of information are read into memory and used by the CPU of the control device 10. The storage device 18 is composed of, for example, an HDD (Hard Disk Drive), a CD-ROM, or a DVD-ROM. The map information will be described with reference to FIG. 2.

  As shown in FIG. 2, the map information 1810 contains, for each mesh identification code (mesh ID) 1811 corresponding to a partitioned area on the map, link data 1820 for each link constituting the roads included in that mesh area. For each link ID 1821, the link data 1820 includes coordinate information 1822 of the two nodes (start node and end node) constituting the link, road type information 1823 of the road to which the link belongs, link length information 1824 indicating the length of the link, a link travel time 1825, link IDs (connection link IDs) 1826 of the links connected to each of the two nodes, a lane number 1827 indicating the number of lanes of the link, and so on. The road type information also indicates whether the road is a toll road. Of course, the configuration of the map information is not limited to the above.
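  To make the data layout above concrete, here is a minimal sketch of how such mesh and link records might be held in memory. The class and field names are illustrative assumptions and do not come from the patent itself.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Link:
    link_id: int                       # link ID 1821
    start_node: Tuple[float, float]    # coordinate information 1822 (start node)
    end_node: Tuple[float, float]      # coordinate information 1822 (end node)
    road_type: str                     # road type information 1823 (e.g. "toll", "general")
    length_m: float                    # link length information 1824
    travel_time_s: float               # link travel time 1825
    connected_link_ids: List[int]      # connection link IDs 1826
    lane_count: int                    # number of lanes 1827

@dataclass
class Mesh:
    mesh_id: int                                           # mesh ID 1811
    links: Dict[int, Link] = field(default_factory=dict)   # link data 1820, keyed by link ID

# Map information 1810: a collection of meshes keyed by mesh ID.
map_info: Dict[int, Mesh] = {}
```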

  Returning to FIG. 1, the various sensors 20 are devices that collect data for calculating the current position of the vehicle, and include, for example, a vehicle speed sensor, a gyro sensor, and the like. The collected data is sent to the control device 10 and used for navigation processing.

  The GPS receiver 22 is a device for receiving a signal from a GPS satellite and generating position information indicating the current position of the vehicle. The generated position information is sent to the control device 10 and used for navigation processing.

  The FM multiplex broadcast/beacon receiving device 24 is a device for receiving traffic information sent from a traffic information distribution center or the like via FM multiplex broadcast radio waves or via radio beacons or optical beacons installed along the road (none of which are shown). The received traffic information is sent to the control device 10 and used for displaying traffic information.

  The communication device 26 is composed of, for example, a general-purpose mobile phone, and receives traffic information sent from a traffic information distribution center or the like via a mobile phone line and base stations. It also transmits traffic information collected by the vehicle to the traffic information distribution center.

  Next, the functional configuration of the navigation device 1 will be described with reference to FIG. 1.

  As shown in the figure, the control device 10 includes a main control unit 100, a current location calculation unit 102, a route calculation unit 104, a display processing unit 106, and a user operation analysis unit 108. These functional units are realized by the CPU of the control device 10 loading the necessary programs and data from the storage device 18 and the ROM onto the RAM and executing the programs. As will be described later, the control device 10 may also include a scale learning unit 110.

  The main control unit 100 is the central functional unit that performs various processes and controls the other processing units according to the processing content. The main control unit 100 also performs navigation processing (for example, display of traffic information, display of the current position, route search, and route guidance), which is the basic operation of the navigation device 1. Specifically, it reads from the storage device 18 the map information around the current location calculated by the current location calculation unit 102, and instructs the display processing unit 106 to display the current location superimposed on the read map information. If there is route information calculated by the route calculation unit 104, the route may be displayed as well. The voice input/output device 16 is used to output voice guidance to the user. In addition, processing corresponding to user operations received via the user operation analysis unit 108 is performed.

  The current location calculation unit 102 calculates the current location using information output from the various sensors 20 and the GPS receiver 22 at predetermined intervals (for example, every predetermined distance and every predetermined time). The calculated current location information is sent to the main control unit 100, the route calculation unit 104, etc., and used for those processes.

  The route calculation unit 104 searches for a recommended route connecting two designated points (the current location and the destination). Specifically, it first acquires the current location calculated by the current location calculation unit 102 and the destination input via the touch input detection device 13, the input device 14, or the voice input/output device 16, and reads the map information from the storage device 18. Then, using the Dijkstra method or the like, it searches for a route connecting the two points (current location and destination) such that the total of the link costs (for example, distance or travel time) of the roads (links) connecting the predetermined points (nodes) along the route is lower than that of any other route. An optimum route may also be searched for using the traffic information acquired via the FM multiplex broadcast/beacon receiving device 24.
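  As an illustration of the kind of search described here, the sketch below runs a plain Dijkstra search over a graph of link costs. The graph representation and the choice of a single numeric cost per link are assumptions made for the example; they are not the device's actual routing code.

```python
import heapq
from typing import Dict, List, Tuple

# Graph: node ID -> list of (neighbor node ID, link cost), where the cost could be
# the link length or the link travel time taken from the map information.
Graph = Dict[int, List[Tuple[int, float]]]

def dijkstra(graph: Graph, start: int, goal: int) -> Tuple[float, List[int]]:
    """Return (total cost, node sequence) of the minimum-cost route from start to goal."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            return cost, path
        for neighbor, link_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + link_cost, neighbor, path + [neighbor]))
    return float("inf"), []      # no route found
```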

  The display processing unit 106 receives instructions from the other functional units, generates drawing commands for causing the display 12 to display a screen, and outputs them. For example, it generates map drawing commands so that roads, other map components, and marks such as the current location, the destination, and arrows for the recommended route are drawn at a designated scale and with a designated drawing method. It also generates drawing commands for drawing images such as menu items and operation buttons for accepting user instructions. The map information 1810 stores map drawing data at a plurality of scales (for example, 1/1,000,000, 1/500,000, ..., 1/5000, 1/2500).

  The user operation analysis unit 108 receives a user operation input via the touch input detection device 13 or the input device 14, analyzes the content of the operation, and controls the other functional units so that processing corresponding to that content is executed. It also analyzes the operation content corresponding to words input via the voice input/output device 16 and recognized by voice recognition, and controls the other functional units accordingly. For example, when the user requests a search for a recommended route, the main control unit 100 is requested to display map information or a dialog for accepting a destination setting. When the user performs a scroll operation or a scale-change operation, the main control unit 100 is requested to display the map scrolled or with its scale changed.

  Next, with reference to FIGS. 3 to 9, characteristic operations executed in the navigation device 1 will be described.

  FIG. 3 is a flowchart showing the flow of processing of enlargement scroll and reduction scroll. This flow starts when the navigation device 1 is activated. As will be described later, enlargement scrolling means that processing for increasing the scale of a map and processing for scrolling a map are performed in parallel. Reduction scrolling means that the process of reducing the scale of the map and the process of scrolling the map are performed in parallel.

  Here, it is assumed that the enlargement/reduction scroll process and the normal scroll process (scrolling without changing the scale) enter a standby state when, for example, the user issues an instruction to activate the scroll function via the touch input detection device 13 or the input device 14. In this state, the display 12 shows a UI (User Interface) screen such as the one shown in FIG. 4.

  FIG. 4 is a diagram illustrating an example of a UI (User Interface) screen for scale-change and scroll operations. As shown in FIG. 4A, a map 300 and a current position mark 302 are displayed on the display 12. Of course, route information or the like may also be displayed. In addition, enlargement scroll buttons 310 and reduction scroll buttons 312 are displayed on the display 12. Here, "+" means enlargement and "−" means reduction. The enlargement scroll buttons 310 and the reduction scroll buttons 312 are displayed in four directions corresponding to the upper, lower, left, and right sides of the display 12, with each enlargement scroll button 310 displayed outside the corresponding reduction scroll button 312. Of course, the shape, arrangement, and labeling of the buttons are not limited to this. For example, the reduction scroll buttons 312 may be arranged outside the enlargement scroll buttons 310. Further, as shown in FIG. 4B, the enlargement scroll buttons 310 and the reduction scroll buttons 312 may each be displayed in eight directions.

  An overview of the processing when the above-described enlargement scroll button 310 or reduction scroll button 312 is touched is as follows. When the user touches the area where the enlargement scroll button 310 or the reduction scroll button 312 is displayed, the screen is scrolled in the direction corresponding to the position of the button. Simultaneously with (in parallel with) the scrolling, the scale is changed in the direction (enlargement or reduction) corresponding to the button. Thus, the user can perform scrolling and a scale change with a single button.

  Note that the UI screen that accepts scale-change and scroll operations is not limited to configurations in which buttons corresponding to the scroll directions are provided, as in FIGS. 4A and 4B. For example, as shown in FIG. 4C, an enlargement scroll area 315 (the area between dotted lines 314 and 316) and a reduction scroll area 317 (the area enclosed by dotted line 316) may be provided for accepting scale-change and scroll operations. The enlargement scroll area 315 and the reduction scroll area 317 are ring-shaped bands of predetermined width surrounding the center of the display 12. In this case, when the user performs a touch operation in the enlargement scroll area 315 or the reduction scroll area 317, the screen is scrolled in the direction from the center of the display 12 toward the touch position. In parallel with the scrolling, the scale is changed in the direction (enlargement or reduction) assigned to the area to which the touch position belongs. The enlargement scroll button 310 and the reduction scroll button 312 may be displayed as guides so that the user can recognize the enlargement scroll area 315 and the reduction scroll area 317. The scroll speed may also be increased as the touch position moves farther from the center of the display 12 within the enlargement scroll area 315 or the reduction scroll area 317. Note that the arrangement of the enlargement scroll area 315 and the reduction scroll area 317 and the scroll direction need not be based on the center of the display 12; they may instead be based on, for example, the center of the displayed map.
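  One plausible way to make the decision described above from a single touch coordinate is to classify the touch by its distance from the screen center and to take the scroll direction as the unit vector from the center toward the touch, as in the sketch below. The radii and return values are assumptions for illustration only.

```python
import math
from typing import Optional, Tuple

# Assumed radii (in pixels) of the areas in FIG. 4C.
REDUCTION_RADIUS = 80       # inside dotted line 316: reduction scroll area 317
ENLARGEMENT_RADIUS = 160    # between dotted lines 316 and 314: enlargement scroll area 315

def classify_touch(touch: Tuple[float, float],
                   center: Tuple[float, float]) -> Tuple[Optional[str], Tuple[float, float]]:
    """Return (scale direction or None for a normal scroll, unit scroll vector)."""
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return None, (0.0, 0.0)
    direction = (dx / dist, dy / dist)          # scroll toward the touch position
    if dist <= REDUCTION_RADIUS:
        return "reduce", direction              # reduction scroll area 317
    if dist <= ENLARGEMENT_RADIUS:
        return "enlarge", direction             # enlargement scroll area 315
    return None, direction                      # outside both areas: normal scroll
```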

  Also, in FIGS. 4A to 4C, normal scrolling without a scale change may be accepted in areas other than those that accept enlargement scroll and reduction scroll operations. That is, when the user performs a touch operation within such an area, the screen is scrolled in the direction from the center of the display 12 toward the touch position.

  Returning to FIG. 3, when this flow is started, the user operation analysis unit 108 determines whether the user's touch operation has been turned on via the touch input detection device 13 (S100). If it is determined that the touch operation is in the ON state (YES in S100), the process proceeds to S110. On the other hand, if it is determined that the touch operation is not in the ON state (NO in S100), S100 is executed again.

  When it is determined that the touch operation is in the ON state (YES in S100), the user operation analysis unit 108 determines whether the operation is a normal scroll operation (S110). Specifically, the user operation analysis unit 108 uses the coordinates of the touch position output by the touch input detection device 13 to determine whether or not the position is within an area for accepting an enlargement scroll or a reduction scroll. If it is determined that the position is within such an area (i.e., not a normal scroll) (NO in S110), the process proceeds to S120. On the other hand, if it is determined that the position is not within such an area (a normal scroll) (YES in S110), the process proceeds to S300.

  When it is determined that the touch position is within an area for accepting an enlargement scroll or a reduction scroll (NO in S110), the user operation analysis unit 108 instructs the main control unit 100 to display the map while performing the enlargement scroll or reduction scroll (S120). Specifically, the main control unit 100 reads the map information 1810, changes the scale in the direction (enlargement or reduction) assigned to the area to which the touch position belongs, scrolls in the direction from the center of the display 12 toward the touch position, and displays the map on the display 12 via the display processing unit 106. When the scale reaches a predetermined value, only scrolling is performed at that scale. Here, the predetermined value is the upper limit of the displayable scale (for enlargement or reduction).

  While the map display is being enlarged or reduced (or only scrolled, once the scale has reached the predetermined value) (S120), the user operation analysis unit 108 determines whether the user's touch operation has been turned off (S130). If it is determined that the touch operation is not in the OFF state (NO in S130), the process returns to S120. On the other hand, if it is determined that the touch operation is in the OFF state (YES in S130), the process proceeds to S140.

  When it is determined that the touch operation is in the OFF state (YES in S130), the main control unit 100 stops the enlargement or reduction scroll and displays the map, centered on the currently displayed area, while changing the scale back to the scale in effect when the enlargement or reduction scroll was started (S140). Once the scale has returned to the original scale, the process proceeds to S100.

  While the map display is being enlarged or reduced (S140), the user operation analysis unit 108 determines whether or not the user's touch operation has been turned on (S150). If it is determined that the touch operation is in the ON state (YES in S150), the process proceeds to S110. On the other hand, if it is determined that the touch operation is not in the ON state (NO in S150), the process proceeds to S140.

  On the other hand, when it is determined that the touch position is not within an area for accepting an enlargement or reduction scroll (YES in S110), the user operation analysis unit 108 instructs the main control unit 100 to display the map while performing a normal scroll (S300). Specifically, the main control unit 100 reads the map information 1810, scrolls in the direction from the center of the display 12 toward the touch position, and displays the map on the display 12 via the display processing unit 106.

  While the map display is being scrolled (S300), the user operation analysis unit 108 determines whether or not the user's touch operation has been turned off (S310). If it is determined that the touch operation is not in the OFF state (NO in S310), the process proceeds to S300. On the other hand, if it is determined that the touch operation is in the OFF state (YES in S310), the process proceeds to S100.

  As described above, enlargement scroll and reduction scroll processing is executed.
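  Putting the branches of FIG. 3 together, the loop below sketches the dispatch between normal scrolling (S300, S310) and enlargement/reduction scrolling (S120 to S150). The callback interface (get_touch, in_zoom_area, and the step functions) is hypothetical and merely stands in for the touch input detection device 13 and the display processing unit 106.

```python
import time

def handle_scroll_loop(get_touch, in_zoom_area, zoom_scroll_step, normal_scroll_step,
                       restore_scale_step, poll_interval: float = 0.05) -> None:
    """Dispatch loop mirroring S100-S310 of FIG. 3.

    get_touch() -> (x, y) while touched, or None when released (S100/S130/S310).
    in_zoom_area(pos) -> True if pos lies in an enlargement/reduction scroll area (S110).
    zoom_scroll_step(pos): scroll one step toward pos while changing the scale (S120).
    normal_scroll_step(pos): scroll one step toward pos at the current scale (S300).
    restore_scale_step() -> True once the scale is back at its original value (S140).
    """
    while True:
        pos = get_touch()                      # S100: wait for a touch ON
        if pos is None:
            time.sleep(poll_interval)
            continue
        if in_zoom_area(pos):                  # S110: enlargement/reduction scroll?
            while (pos := get_touch()) is not None:
                zoom_scroll_step(pos)          # S120: scroll + scale change
                time.sleep(poll_interval)
            while not restore_scale_step():    # S140: step the scale back
                if get_touch() is not None:    # S150: a new touch restarts the flow
                    break
                time.sleep(poll_interval)
        else:                                  # normal scroll
            while (pos := get_touch()) is not None:
                normal_scroll_step(pos)        # S300
                time.sleep(poll_interval)      # S310: stop when the touch turns OFF
```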

  Next, the operations of the above-described enlargement scroll and reduction scroll will be described more specifically with reference to FIGS. 5 to 9.

  FIG. 5 is a diagram for explaining the operations of enlargement scroll and reduction scroll. The horizontal axis shows the passage of time, and the vertical axis indicates changes in the scale (enlargement and reduction) and changes in the touch operation (ON, OFF). The scroll direction is not shown in this figure; as described above, scrolling is performed in the direction corresponding to the button touched by the user or in the direction corresponding to the touch position. FIGS. 6 and 7 show screen transition examples of a reduction scroll, and FIGS. 8 and 9 show screen transition examples of an enlargement scroll.

  FIG. 5A shows a reduction scroll operation. As shown in the figure, the reduction scroll is performed while the touch operation of the reduction scroll is in the ON state (between a and b, S120 in FIG. 3). That is, the map is reduced according to the length of the scroll time. Further, when the touch operation is turned off, the map is enlarged to a predetermined scale (for example, default value, user setting value, etc.) (b-c, S140 in FIG. 3).

  Specifically, when the touch operation is turned on (time point a), the reduction scroll is started, and the map is scrolled while being reduced step by step (wide-area display) through the scales predetermined as displayable (for example, 1/2500, 1/5000, 1/10,000, 1/20,000, 1/40,000, and so on) (between a and b). For example, in the case of scrolling to the right, the screen changes as shown in FIGS. 6 (A-1) to (A-3). Note that a building mark 320 is shown to make the transition easier to follow. When the scale reaches the displayable upper limit, the map is scrolled while maintaining that scale. When the touch operation is turned off (time point b), scrolling is stopped, and the map is enlarged step by step, centered on the map displayed at that moment, back to the scale in effect when the reduction was started (between b and c). For example, when a reduction scroll to the right is finished, the screen changes as shown in FIGS. 7 (A-4) to (A-5). When the original scale is reached, the map is displayed with that scale maintained (after time point c).

  FIG. 5B shows the operation of enlargement scrolling. As shown in the figure, the enlarged scroll is performed while the touch operation of the enlarged scroll is ON (between a and b, S120 in FIG. 3). That is, the map is enlarged according to the length of the scroll time. When the touch operation is turned off, the map is reduced to a predetermined scale (b-c, S140 in FIG. 3).

  Specifically, when the touch operation is turned on (time point a), the enlargement scroll is started, and the map is scrolled while being enlarged step by step through the scales predetermined as displayable (for example, 1/40,000, 1/20,000, 1/10,000, 1/5000, 1/2500, and so on) (between a and b). For example, in the case of scrolling to the right, the screen changes as shown in FIGS. 8 (B-1) to (B-3). When the scale reaches the displayable upper limit, the map is scrolled while maintaining that scale. When the touch operation is turned off (time point b), scrolling is stopped, and the map is reduced step by step, centered on the map displayed at that moment, back to the scale in effect when the enlargement was started (between b and c). For example, when an enlargement scroll to the right is finished, the screen changes as shown in FIGS. 9 (B-4) to (B-5). When the original scale is reached, the map is displayed with that scale maintained (after time point c).
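  The time profiles of FIG. 5 can be summarized as a small helper that steps through a ladder of displayable scales while the touch stays on and walks back when it is released. The scale ladder and the step interval below are example values, not figures taken from the patent.

```python
# Example ladder of displayable scales, detailed to wide (denominators of 1/N); the
# values and the step interval are assumptions for illustration.
SCALE_STEPS = [2500, 5000, 10000, 20000, 40000]
STEP_INTERVAL_S = 0.5   # time between scale steps while the touch stays ON

def scale_index_after_hold(start_index: int, direction: str, hold_time_s: float) -> int:
    """Index in SCALE_STEPS reached after the touch has been held for hold_time_s.

    "reduce" steps toward wider scales, "enlarge" toward more detailed ones;
    the index saturates at the end of the ladder (the displayable limit).
    """
    steps = int(hold_time_s / STEP_INTERVAL_S)
    if direction == "reduce":
        return min(start_index + steps, len(SCALE_STEPS) - 1)
    return max(start_index - steps, 0)

# On release (time point b in FIG. 5) the display walks back from the reached index
# to start_index one step at a time, centered on the map shown at that moment.
```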

  Note that the stepwise change of the scale is preferably performed in finer units (for example, in 10 m steps). In the above description, when the touch operation is turned off, scrolling is stopped and the scale of the map is changed step by step back to the initial scale; however, the scale need not be changed back. Further, to cancel the map display at the position reached by the enlargement or reduction scroll and return to the map around the vehicle's own position, an instruction may be accepted via, for example, the touch input detection device 13 or the input device 14.

  The first embodiment of the present invention has been described above. According to this embodiment, the map display can be scrolled and its scale changed simultaneously by a simple operation. The simple operation also reduces the burden on the user and improves driving safety. In other words, by providing buttons or areas for enlargement scrolling and reduction scrolling on the display, a user who wants to change the scale of the map being displayed by scrolling no longer needs to change the scale by a separate, independent operation. Moreover, the user can simply decide in advance whether to enlarge or reduce the map according to the purpose, such as whether detailed or rough information about the point to be displayed by scrolling is desired, and then scroll, obtaining the desired information by an intuitive operation.

  Next, a modification of the first embodiment will be described with reference to the drawings. In this modification, the preset upper limit value (predetermined value) of the scale can be changed by a user operation during an enlargement or reduction scroll operation. Hereinafter, the description focuses on the differences from the first embodiment.

  FIG. 11 is a diagram for explaining operations of enlargement scroll and reduction scroll according to a modification. As in FIG. 5, the horizontal axis indicates the passage of time, and the vertical axis indicates a change in scale (enlargement / reduction) and a change in touch operation (ON, OFF).

  FIG. 11A shows a reduction scroll operation. As shown in this figure, when the touch operation is turned on (time point a), the reduction scroll is started, and the map is scrolled while being reduced (wide-area display) step by step (between a and b). When the scale reaches the preset upper limit value (time point b), the map is scrolled while maintaining that scale (between b and c). Here, when a predetermined touch operation is detected, for example an operation that turns the touch back on within a predetermined time after it is turned off (it may be an operation like a double click) (time point c), the upper limit is changed to a scale one step smaller, and the map is scrolled while being further reduced (between c and d). The same applies at time point d. When the scale reaches the newly set upper limit, the map is scrolled while maintaining that scale (between d and e). When the touch operation is turned off (time point e), scrolling is stopped, and the map is enlarged step by step, centered on the map displayed at that moment, back to the scale in effect when the reduction was started (between e and f). When the original scale is reached, the map is displayed with that scale maintained (after time point f).

  FIG. 11B shows an enlargement scroll operation. As shown in the figure, when the touch operation is turned on (time point a), the enlargement scroll is started, and the map is scrolled while being enlarged step by step (between a and b). When the scale reaches the preset upper limit value (time point b), the map is scrolled while maintaining that scale (between b and c). Here, when a predetermined touch operation is detected, for example an operation that turns the touch back on within a predetermined time after it is turned off (it may be an operation like a double click) (time point c), the upper limit is changed to a scale one step larger, and the map is scrolled while being further enlarged (between c and d). The same applies at time point d. When the scale reaches the newly set upper limit, the map is scrolled while maintaining that scale (between d and e). When the touch operation is turned off (time point e), scrolling is stopped, and the map is reduced step by step, centered on the map displayed at that moment, back to the scale in effect when the enlargement was started (between e and f). When the original scale is reached, the map is displayed with that scale maintained (after time point f).

  FIG. 10 is a flowchart showing the flow of the enlargement scroll and reduction scroll processing according to the modification. This flow starts when the navigation device 1 is activated. Hereinafter, the description focuses on the differences from the flow of FIG. 3; the same processes as those in FIG. 3 are denoted by the same reference numerals.

  As shown in this figure, while the map display is being enlarged or reduced (or only scrolled, once the scale has reached the predetermined value) (S120), the user operation analysis unit 108 determines whether the user's touch operation has been turned off (S130). If it is determined that the touch operation is not in the OFF state (NO in S130), the process returns to S120. On the other hand, if it is determined that the touch operation is in the OFF state (YES in S130), the process proceeds to S132.

  When it is determined that the touch operation is in the OFF state (YES in S130), the user operation analysis unit 108 determines whether or not the touch operation is turned on again within a predetermined time (S132). If it is determined that the touch operation is not turned on again (NO in S132), the process proceeds to S140. On the other hand, if it is determined that the touch operation is turned on again (YES in S132), the process proceeds to S134.

  When it is determined that the touch operation has been turned on again (YES in S132), the main control unit 100 changes the predetermined value of the scale (S134). Specifically, the main control unit 100 changes the preset scale value to a value one step larger or smaller. The process then returns to S120. Note that when the process of S120 is completed, the predetermined value of the scale is reset to the predetermined default value or the like.
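  One way to express the S130/S132/S134 decision is as a small release/re-touch handler that treats a touch turned back on within a short window as a request to push the upper limit one step further. The window length, the index-based limit, and the class layout are assumptions made for this sketch.

```python
import time
from typing import Optional

DOUBLE_TOUCH_WINDOW_S = 0.4   # assumed "predetermined time" for the re-touch (S132)

class ZoomScrollLimit:
    """Tracks the scale upper limit during one enlargement/reduction scroll."""

    def __init__(self, default_limit_index: int, max_index: int):
        self.default_limit_index = default_limit_index
        self.limit_index = default_limit_index
        self.max_index = max_index
        self.last_release: Optional[float] = None

    def on_release(self) -> None:
        """Touch turned OFF (S130)."""
        self.last_release = time.monotonic()

    def on_touch(self) -> bool:
        """Touch turned ON again. Returns True if the limit was extended (S134)."""
        now = time.monotonic()
        if self.last_release is not None and now - self.last_release <= DOUBLE_TOUCH_WINDOW_S:
            self.limit_index = min(self.limit_index + 1, self.max_index)  # one step further
            return True
        return False

    def reset(self) -> None:
        """After the scroll (S120) finishes, restore the default upper limit."""
        self.limit_index = self.default_limit_index
```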

  The modification of the first embodiment of the present invention has been described above. According to this modification, in addition to the effects of the first embodiment, the user can change the scale during an enlargement or reduction scroll and can thus flexibly obtain the desired information according to the situation. For example, the user can change the scale with a simple operation when a wider view is wanted during a reduction scroll or a more detailed view during an enlargement scroll. Note that the user may be allowed to choose between the enlargement/reduction scroll operation of the first embodiment and that of this modification.

  Next, a second embodiment of the present invention will be described with reference to the drawings. In the second embodiment, the scale used by the user during the enlargement scroll or reduction scroll operation is accumulated as statistical data, and the optimum scale is preferentially set during the operation. The following description will focus on differences from the modification of the first embodiment.

  With reference to FIG. 1, the functional configuration of the navigation device 1 according to the present embodiment will be described.

  In the present embodiment, the control device 10 includes a scale learning unit 110 in addition to the functional configuration of the first embodiment described above.

  As shown in FIG. 12, the scale learning unit 110 includes a reduced scale DB 1101 and an enlarged scale DB 1102. Each DB stores, in association with each scale, the number of times that scale has been set in past enlargement or reduction scrolls. These DBs are stored on the storage device 18 when they are to be retained across power cycles of the navigation device 1, and on the RAM when they are held only temporarily.

  FIG. 13 is a flowchart showing the flow of the enlargement scroll and reduction scroll processing according to the present embodiment. This flow starts when the navigation device 1 is activated. Hereinafter, the description focuses on the differences from the flow of FIG. 10; processes similar to those in FIG. 10 are denoted by the same reference numerals.

  As shown in this figure, when it is determined that the touch position is within an area for accepting an enlargement or reduction scroll (NO in S110), the main control unit 100 acquires an optimum scale value from the scale learning unit 110 and sets it as the predetermined value of the scale (S112). Specifically, when the scale learning unit 110 receives a request from the main control unit 100, it selects, from the DB corresponding to the scale-change direction (enlargement or reduction) assigned to the area to which the touch position belongs (the enlarged scale DB 1102 or the reduced scale DB 1101), the scale that has been set the largest number of times and notifies the main control unit 100 of it. The main control unit 100 sets the acquired scale as the predetermined value, and the process proceeds to S120.

  With the predetermined value set as described above as the upper limit, the map display is enlarged or reduced (or only scrolled, once the scale has reached the predetermined value) (S120). When it is determined that the touch operation has been turned off (YES in S130), the user operation analysis unit 108 determines whether or not the touch operation is turned on again within a predetermined time (S132). If it is determined that the touch operation is not turned on again (NO in S132), the process proceeds to S133. On the other hand, if it is determined that the touch operation is turned on again (YES in S132), the process proceeds to S134.

  When it is determined that the touch operation is not turned on again (NO in S132), the scale learning unit 110 increments by one the set count stored in the enlarged scale DB 1102 or the reduced scale DB 1101 for the scale value that was set as the predetermined value (S133). Thereafter, the process proceeds to S140.
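  A minimal in-memory stand-in for the reduced scale DB 1101 and the enlarged scale DB 1102 could be a pair of counters keyed by scale, as sketched below. Persistence to the storage device 18 and the exact key format are omitted and are assumptions of the example.

```python
from collections import Counter
from typing import Optional

class ScaleLearning:
    """Per-direction counts of how often each scale was used as the upper limit."""

    def __init__(self):
        self.db = {"reduce": Counter(), "enlarge": Counter()}   # DB 1101 / DB 1102

    def record(self, direction: str, scale_denominator: int) -> None:
        """S133: increment the count for the scale that was used as the limit."""
        self.db[direction][scale_denominator] += 1

    def most_used(self, direction: str) -> Optional[int]:
        """S112: return the scale set the largest number of times, if any."""
        counts = self.db[direction]
        if not counts:
            return None
        return counts.most_common(1)[0][0]

# Usage: learning.record("reduce", 20000); learning.most_used("reduce") -> 20000
```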

  The second embodiment of the present invention has been described above. According to this embodiment, the same effects as those of the first embodiment and its modification can be obtained. In addition, since a value optimal for the user is set as the upper limit of the scale, the user can more easily obtain the desired information without having to change the scale during an enlargement or reduction scroll operation.

  The present invention has been described in connection with exemplary embodiments. Obviously, many alternatives, modifications, and variations will be apparent to practitioners skilled in this art. Accordingly, the above-described embodiments of the present invention are intended to illustrate and not limit the gist and scope of the present invention.

  For example, the present invention can be applied not only to enlargement/reduction scrolling of a planar map but also to enlargement/reduction scrolling of a three-dimensional map in bird's-eye view display. In bird's-eye view display, the depression angle can be changed according to the direction of the scale change (enlargement or reduction) of the enlargement/reduction scroll. For example, in the case of a reduction scroll, the depression angle θ is increased as shown in FIGS. 14A to 14B, and in the case of an enlargement scroll, the depression angle θ is decreased as shown in FIGS. 14A to 14C, to display the three-dimensional map. The depression angle θ may also be changed according to the scroll time of the enlargement/reduction scroll; for example, in the case of a reduction scroll, the depression angle θ can be increased as the scroll time becomes longer, and in the case of an enlargement scroll, it can be decreased as the scroll time becomes longer. Further, the upper limit value of the depression angle may be changed by a predetermined operation during the enlargement/reduction scroll.
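  For the bird's-eye view variant, the depression angle could be driven by the same hold time as the scale, for example by moving it at a fixed rate between a minimum and a maximum angle. The bounds and the rate below are illustrative assumptions, not values from the patent.

```python
# Assumed depression-angle bounds (degrees) and rate of change per second of scrolling.
MIN_DEPRESSION_DEG = 20.0
MAX_DEPRESSION_DEG = 80.0
DEG_PER_SECOND = 10.0

def depression_angle(initial_deg: float, direction: str, scroll_time_s: float) -> float:
    """Depression angle after scrolling for scroll_time_s.

    A reduction scroll raises the angle (more top-down, FIG. 14B); an enlargement
    scroll lowers it (closer to the ground, FIG. 14C), each clamped to an assumed range.
    """
    delta = DEG_PER_SECOND * scroll_time_s
    if direction == "reduce":
        return min(initial_deg + delta, MAX_DEPRESSION_DEG)
    return max(initial_deg - delta, MIN_DEPRESSION_DEG)
```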

  Further, for example, a dedicated hard switch may be provided as an operation button for enlargement scroll or reduction scroll without using a touch panel.

Brief Description of the Drawings
FIG. 1 is a block diagram showing the hardware configuration and functional configuration of a navigation device.
FIG. 2 is a diagram for explaining the structure of the map information.
FIG. 3 is a flowchart showing the flow of the enlargement scroll and reduction scroll processing.
FIG. 4 is a diagram showing examples of UI screens for scrolling and scale change.
FIG. 5 is a diagram for explaining the operations of enlargement scroll and reduction scroll.
FIG. 6 is a diagram showing a screen transition example of a reduction scroll (part 1).
FIG. 7 is a diagram showing a screen transition example of a reduction scroll (part 2).
FIG. 8 is a diagram showing a screen transition example of an enlargement scroll (part 1).
FIG. 9 is a diagram showing a screen transition example of an enlargement scroll (part 2).
FIG. 10 is a flowchart showing the flow of the enlargement scroll and reduction scroll processing according to the modification.
FIG. 11 is a diagram for explaining the operations of the enlargement scroll and reduction scroll according to the modification.
FIG. 12 is a diagram for explaining the structure of the scale learning unit.
FIG. 13 is a flowchart showing the flow of the enlargement scroll and reduction scroll processing according to the second embodiment.
FIG. 14 is a diagram for explaining the setting of the depression angle in bird's-eye view display.

Explanation of symbols

1 ... navigation device, 10 ... control device, 11 ... display unit, 12 ... display, 13 ... touch input detection device, 14 ... input device, 16 ... voice input/output device, 18 ... storage device, 20 ... various sensors, 22 ... GPS receiving device, 24 ... FM multiplex broadcast/beacon receiving device, 26 ... communication device, 100 ... main control unit, 102 ... current location calculation unit, 104 ... route calculation unit, 106 ... display processing unit, 108 ... user operation analysis unit, 110 ... scale learning unit, 300 ... map, 302 ... current position mark, 310 ... enlargement scroll button, 312 ... reduction scroll button, 314 ... dotted line, 315 ... enlargement scroll area, 316 ... dotted line, 317 ... reduction scroll area, 320 ... building mark, 1101 ... reduced scale DB, 1102 ... enlarged scale DB, 1810 ... map information, 1811 ... mesh ID, 1820 ... link data, 1821 ... link ID, 1822 ... coordinate information, 1823 ... road type information, 1824 ... link length information, 1825 ... link travel time, 1826 ... connection link ID, 1827 ... number of lanes

Claims (10)

  1. A navigation device for guiding a user by displaying a map on a display,
    Detecting means for detecting a user operation;
    Display processing means for scrolling the map according to the content of the operation detected by the detection means and for displaying the map while changing the scale according to the length of the scroll time;
    A navigation device comprising:
  2. The navigation device according to claim 1,
    The display processing means includes
    Changing the scale of the map to a smaller scale as the scroll time becomes longer,
    A navigation device characterized by the above.
  3. The navigation device according to claim 1,
    The detection means includes
    Displaying a screen for accepting a user operation and detecting a user operation including a scroll direction and a scale-change direction,
    The display processing means includes
    Scrolling the map in the scroll direction detected by the detection means and displaying the map while changing the scale in the scale-change direction detected by the detection means,
    A navigation device characterized by the above.
  4. The navigation device according to claim 3,
    The detection means includes
    Providing, on a screen for accepting a user operation, a first area for accepting a scale reduction together with a scroll direction and a second area for accepting a scale enlargement together with a scroll direction, and
    Detecting, from the user's touch operation on either of the areas, the scale-change direction assigned to the area to which the touch position belongs and the scroll direction corresponding to the direction from a predetermined center position of the screen toward the touch position,
    A navigation device characterized by the above.
  5. The navigation device according to claim 3,
    The detection means includes
    Based on a predetermined center position of the screen, a screen for accepting a user's operation is displayed in which a button for simultaneously accepting the direction of the scale change and the direction of the scroll is arranged at a position corresponding to the direction of the scroll,
    Detecting a scroll direction corresponding to a direction of a scale change determined for the button and a direction from a predetermined center position of the screen to the position of the button by a user's touch operation on the button;
    A navigation device characterized by the above.
  6. The navigation device according to claim 4 or 5,
    The display processing means includes
    Changing the scale, up to a set upper limit value, according to the length of time of the touch operation detected by the detection means;
    A navigation device characterized by the above.
  7. The navigation device according to claim 6,
    The display processing means includes
    When a predetermined operation is detected while the touch operation is continuously detected by the detection means, the set upper limit value is updated.
    A navigation device characterized by the above.
  8. The navigation device according to claim 7,
    Further comprising scale storage means for storing, for each scale, the number of times the scale has been set;
    The display processing means includes
    When changing the scale and starting scrolling according to the operation detected by the detection means, setting the scale acquired from the scale storage means as an upper limit value in advance,
    A navigation device characterized by the above.
  9. The navigation device according to claim 8,
    The display processing means includes
    Obtaining, from the scale storage means, the scale that has been set the largest number of times;
    A navigation device characterized by the above.
  10. A scroll method in a navigation device for guiding a user by displaying a map on a display,
    The navigation device
    A detection step for detecting a user operation;
    A display step of scrolling the map according to the content of the operation detected in the detection step and displaying the map while changing the scale according to the length of the scroll time;
    The scroll method characterized by performing.
JP2007272058A 2007-10-19 2007-10-19 Navigation device and scroll method Pending JP2009098086A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007272058A JP2009098086A (en) 2007-10-19 2007-10-19 Navigation device and scroll method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2007272058A JP2009098086A (en) 2007-10-19 2007-10-19 Navigation device and scroll method

Publications (1)

Publication Number Publication Date
JP2009098086A true JP2009098086A (en) 2009-05-07

Family

ID=40701223

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007272058A Pending JP2009098086A (en) 2007-10-19 2007-10-19 Navigation device and scroll method

Country Status (1)

Country Link
JP (1) JP2009098086A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07280577A (en) * 1994-04-05 1995-10-27 Sumitomo Electric Ind Ltd Map scrolling method in navigation system
JPH10268759A (en) * 1997-03-21 1998-10-09 Sony Corp Electronic map display device
JPH11288455A (en) * 1998-04-02 1999-10-19 Ffc:Kk Image scroll display device and storage medium for storing image scroll display program
JPH11327433A (en) * 1998-05-18 1999-11-26 Denso Corp Map display device
JP2000180188A (en) * 1998-12-18 2000-06-30 Kenwood Corp Navigator
JP2002081942A (en) * 2000-09-06 2002-03-22 Kenwood Corp Navigator
JP2007065476A (en) * 2005-09-01 2007-03-15 Nissan Motor Co Ltd Map display apparatus and map display method
WO2007114067A1 (en) * 2006-04-06 2007-10-11 Pioneer Corporation Map display device and map display method

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010266961A (en) * 2009-05-12 2010-11-25 Sony Corp Information processor, information processing method and information processing program
US8970630B2 (en) 2009-05-12 2015-03-03 Sony Corporation Information processing device, information processing method, and information processing program
JP2013511102A (en) * 2009-11-17 2013-03-28 クアルコム,インコーポレイテッド How to scroll items on a touch screen user interface
JP2013160529A (en) * 2012-02-01 2013-08-19 Clarion Co Ltd Information terminal, program and map display method
WO2014087523A1 (en) * 2012-12-06 2014-06-12 パイオニア株式会社 Electronic apparatus
US9971475B2 (en) 2012-12-06 2018-05-15 Pioneer Corporation Electronic apparatus

Legal Events

Date Code Title Description
A711 Notification of change in applicant

Free format text: JAPANESE INTERMEDIATE CODE: A712

Effective date: 20100215

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20101015

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20120215

A131 Notification of reasons for refusal

Effective date: 20120221

Free format text: JAPANESE INTERMEDIATE CODE: A131

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20120703