JP2010019643A - Information terminal, navigation apparatus, and option display method - Google Patents



Publication number
JP2010019643A
Authority
JP
Japan
Prior art keywords
display
option
information terminal
menu
display screen
Prior art date
Legal status
Pending
Application number
JP2008179421A
Other languages
Japanese (ja)
Inventor
Takashi Hosoya
Yusuke Oku
Original Assignee
Toyota Motor Corp
Priority date
Filing date
Publication date
Application filed by Toyota Motor Corp
Priority to JP2008179421A
Publication of JP2010019643A
Application status: Pending


Abstract

To provide an information terminal, a navigation device, and an option display method capable of displaying a lower hierarchy without always displaying two hierarchies and without requiring the user to input a decision.
An information terminal 50 displays options 200 of the hierarchy below an upper-hierarchy option 101 corresponding to a contact position on a display device 21 when an operation means 22 touches a contact position detection means (touch panel) 24 that detects the contact position. A sensor 23 detects that the operation means 22 is close, in the space in front of the touch panel 24, and when the sensor 23 detects that the operation means 22 is approaching an upper-hierarchy option 101, display control means 31, 32, and 33 display the options 200 of the hierarchy below that option 101 on the display device 21.
[Selection] Figure 1

Description

  The present invention relates to an information terminal, a navigation apparatus, and an option display method in which deciding on an upper-hierarchy option enables selection within that option's lower hierarchy.

  Vehicle information terminals such as navigation devices store various information, including facility point data (addresses, postal codes, telephone numbers, names) and telephone books. However, the user may not know the information that specifies a facility, such as its address, and the display device of a vehicle information terminal is limited in size, so various measures are taken to make desired information easy to access. For example, a hierarchical search can be performed by sequentially narrowing down prefecture names, city names, town names, and so on while looking at the display screen.

However, when a display screen is generated for each hierarchy in this way, it may be difficult to grasp the relationship between the preceding and following hierarchies. On this point, a technique has been proposed in which the previous hierarchy is displayed on one display screen together with the selected hierarchy (see, for example, Patent Document 1). FIG. 11 shows an example of a display screen displayed by a conventional navigation device. FIG. 11A shows one display screen in which "Amusement Hotel" is selected on the left side and "Hotel A to Hotel F" is displayed on the right side as a list of amusement hotels. When the user selects "Hotel B" on the display screen of FIG. 11A, the right half of the display screen moves to the left as shown in FIG. 11B. Then, as shown in FIG. 11C, detailed information on "Hotel B" is displayed in the right half of the display screen.
JP-A-11-142171

  However, the navigation device described in Patent Document 1 has the problem that the display screen must be operated (touched) in order to display a lower hierarchy. For example, in FIG. 11A, in order to display the "Minshuku" (guesthouse) menu, "Minshuku" in the left half (even if it was originally in the right half) must be selected. In other words, the display screen of the lower hierarchy cannot be displayed unless the user "inputs" a "decision" about what to select.

  Further, the navigation device described in Patent Document 1 has the problem that the number of user operations increases as the hierarchy becomes deeper. This navigation device displays the menu list of an upper hierarchy and the lower hierarchy of one of its items on one display screen, but until a "decision" is "input", the menu list one level below the currently displayed lower hierarchy (the right half) cannot be displayed. FIGS. 11D to 11G show transition examples of display screens when tracing down a hierarchy of addresses. For example, when the user selects "Kanto" → "Tokyo" → "Minato Ward" in this order, the display screen of FIG. 11G is displayed. However, if the user notices that the target place is not on the display screen of FIG. 11G, it may be necessary to "return" to the display screen of FIG. 11F (when nothing can be selected from the left half of FIG. 11G). In other words, even if upper and lower hierarchies are shown on one display screen, once the hierarchy becomes deep the device is effectively the same as one that can display only two hierarchies and requires repeated "decision" and "return" inputs.

  In addition, the navigation device described in Patent Document 1 has the problem that, because two hierarchies are always displayed, the display screen is occupied by menus; from the user's point of view, a choice must be made from many options, which may reduce operability.

  SUMMARY OF THE INVENTION The present invention has been made in view of the above problems, and an object thereof is to provide an information terminal, a navigation device, and an option display method capable of displaying a lower hierarchy without always displaying two hierarchies and without requiring the user to input a decision.

  In view of the above problems, the present invention provides an information terminal in which, when an operation means touches a contact position detection means that detects a contact position, lower-hierarchy options of the upper-hierarchy option corresponding to the contact position are displayed on a display device, the information terminal comprising: a sensor that detects that the operation means is close, in the space in front of the contact position detection means; and display control means for displaying, when the sensor detects that the operation means is approaching an upper-hierarchy option, the lower-hierarchy options of that option on the display device.

  According to the present invention, it is possible to provide an information terminal, a navigation device, and an option display method capable of displaying a lower hierarchy without always displaying two hierarchies and without requiring the user to input a decision.

  Hereinafter, the best mode for carrying out the present invention will be described with reference to the accompanying drawings.

  FIG. 1 shows an example of a display screen displayed on the display 21 of the information terminal 50. The display 21 of the information terminal 50 according to the present embodiment includes a so-called touch panel 24, which detects the contact position touched by the user's finger or by a rod-shaped pointing device held by the user (hereinafter referred to as the operation means 22). Operation information is input to the information terminal 50 according to the menu displayed at the contact position. In addition to the touch panel 24, the information terminal 50 includes a touch position prediction sensor 23 that detects the position of the operation means 22 before it touches the display 21. The touch position prediction sensor 23 detects the operation means 22, together with its position, while it is within a predetermined distance of the display 21.

  In FIG. 1A, menus 101a to 101d (hereinafter referred to as menu 101 when not distinguished) are displayed in a menu list 100. When the user brings the operation means 22 close to any of the menus 101a to 101d in the menu list 100, the information terminal 50 displays the menu list 200 of the lower hierarchy of the menu 101 that the operation means 22 is about to touch. For example, FIG. 1B shows the menu list 200a in the lower hierarchy of the "genre" menu 101c, and FIG. 1C shows the menu list 200b in the lower hierarchy of the "memory" menu 101b. If the operation means 22 moves away from the menu list 100, the lower-hierarchy menu list 200a or 200b (menu list 200 when not distinguished) is no longer displayed.

  Accordingly, without inputting operation information to the touch panel 24 as a decision (that is, without touching it), the user can display the lower-hierarchy menu list 200 simply by bringing the operation means 22 close, while the upper-hierarchy menu list 100 remains displayed at all times.
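As a minimal sketch (menu names and tree contents are illustrative, not taken from the patent), the hover-to-preview behavior just described can be modeled as: proximity over an upper-hierarchy menu reveals its lower-hierarchy list, and moving the operation means away hides it again without any decision input.

```python
# Hypothetical menu tree: top-level menus from FIG. 1 with assumed children.
MENU_TREE = {
    "50 sounds":    ["a", "ka", "sa"],
    "memory":       ["artist", "album", "track"],
    "genre":        ["eat", "buy", "stay", "play"],
    "phone number": ["0", "1", "2"],
}

def visible_lists(hovered):
    """Return the screen contents for a given hovered upper menu (or None)."""
    screen = {"upper": list(MENU_TREE), "lower": []}
    if hovered in MENU_TREE:      # sensor 23 reports proximity to this menu
        screen["lower"] = MENU_TREE[hovered]
    return screen                 # no hover: the lower list stays hidden
```

Note that no "touch" event appears anywhere here; the lower list is purely a preview driven by the proximity signal.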

  The user can view the lower-hierarchy menu list 200 while moving the operation means 22 over the display 21 without touching the menu list 100; upon finding the desired lower-hierarchy menu list 200, the user brings the operation means 22 into contact with the corresponding upper-hierarchy menu 101. For example, in FIG. 1C, when the operation means 22 touches the "memory" menu 101b, the display screen changes as shown in FIG. 1D, and the menus 200b1 to 200b3 of the lower-hierarchy menu list 200b become selectable (contactable).

  In this way, since the user inputs operation information by bringing the operation means 22 into contact with the upper-hierarchy menu 101 after confirming the lower-hierarchy menu list 200, operation errors can be reduced. Moreover, since the lower-hierarchy menu list 200 is not always displayed, the display screen is not occupied by the upper- and lower-hierarchy menu lists 100 and 200. Further, since the lower-hierarchy menu lists 200 displayed can be kept to a minimum, the user is not confused and operability is not lowered.

  In the following, the user's decision input is expressed as the operation means 22 touching the touch panel 24 (strictly speaking, the operation means 22 approaching the touch panel 24 is also an input of operation information, but here operation information is said to be input when the operation means 22 touches the touch panel 24). Detection of the operation means 22 by the touch position prediction sensor 23 is expressed simply as the operation means 22 approaching the touch panel 24. "Menu" by itself refers to the menu as a whole.

  FIG. 2 shows an example of a hardware configuration diagram of the information terminal 50. Since the information terminal 50 can be configured integrally with, for example, a navigation system, it is controlled by a navigation ECU (Electronic Control Unit) 20 in the present embodiment, but it may be controlled by another ECU. The information terminal 50 also need not be mounted on a vehicle; for example, a portable PND (Personal Navigation Device), a PDA (Personal Digital Assistant), or a mobile phone can be used as appropriate.

  The navigation ECU 20 is connected to a GPS (Global Positioning System) receiver 27, the touch position prediction sensor 23, the display 21, the touch panel 24, a map DB (Data Base) 25, a communication device 26, and the like, via an in-vehicle LAN such as a CAN (Controller Area Network) or LIN (Local Interconnect Network), or via dedicated lines.

  The GPS receiver 27 detects the current position (latitude / longitude / altitude) of the vehicle based on the arrival time of radio waves transmitted from GPS satellites. The navigation ECU 20 accumulates the travel distance in the traveling direction from the position detected by the GPS receiver 27 and estimates the position of the vehicle with high accuracy. In the map DB 25, nodes that divide roads at intersections and at predetermined intervals are stored in association with latitudes and longitudes, and a road network is expressed by connecting the nodes with links corresponding to the roads. The map DB 25 stores the point data positions of facilities such as gas stations, parking lots, and public facilities in association with latitude and longitude.

  The communication device 26 connects, via for example a mobile phone base station or wireless LAN access point and a carrier's data server, to a network such as the Internet according to a predetermined communication protocol (for example, IP or TCP), and transmits and receives various information to and from servers outside the vehicle. This makes it possible to acquire, for example, information on facilities around the vehicle, traffic information, and map information. In addition, functions for reproducing television broadcasts, radio broadcasts, music such as CDs, and video such as DVDs may be provided.

  The display 21 is an HMI (Human Machine Interface) that displays a display screen and includes the touch panel 24 for detecting the contact position of the operation means 22. The display 21 is a flat panel display such as a liquid crystal or organic EL display housed in, for example, the center cluster, and may further include a HUD (Head Up Display). In addition to the touch panel 24, operation information input means such as a keyboard or voice input may be provided. The touch panel 24 detects the contact position of the operation means 22 by, for example, detecting an electric signal corresponding to the contact position with transparent electrodes arranged in a planar pattern, using a resistive film method or a capacitance method. The touch panel 24 is not limited to detecting one contact position at a time; two or more contact positions can be detected simultaneously.

  Moreover, although an example in which the display 21 and the touch panel 24 are integrated is shown, the touch panel 24 may be separate or detachable, with the contact position transmitted from the touch panel 24 to the information terminal 50 by wire or wirelessly. In this case, the touch position prediction sensor 23 described below is provided on the touch panel 24 side, and a lower-hierarchy menu can similarly be displayed on the display 21 without touching the touch panel 24.

[Touch position prediction sensor 23]
The touch position prediction sensor 23 will be described. FIG. 3A is a diagram schematically explaining an outline of the touch position prediction sensor 23. The touch position prediction sensor 23 detects the position of the operation means 22 that has come within a predetermined distance of the touch panel 24 (hereinafter referred to as the pre-touch position) by, for example, an infrared light-shielding method. This predetermined distance need not be uniform over the surface of the touch panel 24.

  The touch position prediction sensor 23 has a plurality of infrared projectors 23a arranged linearly along the long and short sides of the display 21, and a plurality of light receiving sensors 23b arranged linearly along the opposite long and short sides. The infrared projectors 23a and the light receiving sensors 23b are arranged several cm to 10 cm away from the display 21. If nothing blocks the light, the light emitted by an infrared projector 23a is detected by the corresponding light receiving sensor 23b, but when the operation means 22 is present, the infrared light is blocked. The pre-touch position of the operation means 22 is therefore identified as the position where light receiving sensors 23b in both the long-side and short-side directions fail to detect infrared light.
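The blocked-beam logic above can be sketched as follows (an assumed illustration, not the patent's implementation): one sensor row gives the x index of the shadow, the other gives the y index, and their intersection is the pre-touch position.

```python
def pre_touch_position(row_x, row_y):
    """row_x / row_y: per-sensor booleans along each side, True = infrared
    light detected. Returns an (x, y) index pair where light is blocked,
    or None when nothing shields the grid."""
    blocked_x = [i for i, lit in enumerate(row_x) if not lit]
    blocked_y = [j for j, lit in enumerate(row_y) if not lit]
    if not blocked_x or not blocked_y:
        return None
    # Take the centre of the shadow cast by the operation means.
    x = sum(blocked_x) // len(blocked_x)
    y = sum(blocked_y) // len(blocked_y)
    return (x, y)
```

A second stacked grid, as the next paragraph notes, would add a z index by the same principle.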

  Further, if infrared projectors 23a and light receiving sensors 23b are stacked in the direction perpendicular to the display 21, the position of the operation means 22 in that perpendicular direction can also be detected.

  FIG. 3B shows another example of the touch position prediction sensor 23. The touch position prediction sensor 23 of FIG. 3B incorporates an optical sensor 23c in each pixel of the display 21 and photographs objects above the display 21 pixel by pixel. The luminance detected by each optical sensor 23c is affected by the luminance and color of its own pixel, but when the operation means 22 approaches, the detected luminance changes because of the image of the operation means 22. The operation means 22 can therefore be detected at the positions of the optical sensors 23c whose luminance has changed. Alternatively, the operation means 22 may be photographed by a camera provided outside the display 21 and its position detected.

  In order to improve the detection accuracy of the operation means 22, a near-infrared projector that illuminates the tip of the operation means 22 may be arranged, with the sensitivity of the optical sensors 23c set in the near-infrared wavelength region. Alternatively, display control that interrupts the display screen for only a very short time may be repeated every display cycle, and the operation means 22 photographed while the display screen is blanked.

  In the case of the optical sensors 23c, for example, the size of the image of the operation means 22 when it touches the touch panel 24 can be taken as a reference, and the position of the operation means 22 in the direction perpendicular to the display 21 detected from changes in image size relative to that reference. In either of the modes of FIGS. 3A and 3B, the positions of two operation means 22 can be detected simultaneously.

[Details of information terminal 50]
FIG. 4 shows an example of a functional block diagram of the information terminal 50. The navigation ECU 20 is a form of computer including a CPU, an ASIC (Application Specific Integrated Circuit), an input/output interface, registers, memory, switching elements (IGBTs (Insulated Gate Bipolar Transistors) and the like), a CAN interface, and so on. The next display screen determination unit 31, the display screen processing unit 32, the input/display unit 33, and the predicted touch position detection unit 34 are realized by the CPU executing a program stored in the memory, or by hardware such as the ASIC. A display screen information storage unit 36 and a system state information storage unit 35 are implemented on a storage medium such as a hard disk drive or flash memory.

  The predicted touch position detection unit 34 detects, from the pre-touch position detected by the touch position prediction sensor 23, the contact position expected when the touch panel 24 is eventually touched (hereinafter referred to as the predicted touch position). In many cases the pre-touch position and the predicted touch position coincide in the plane parallel to the display 21. This is because the user often brings the operation means 22 toward the touch panel 24 vertically, from directly above the menu 101, so the pre-touch position becomes (even if it fluctuates slightly) the contact position on the touch panel 24.

  On the other hand, the predicted touch position can also be detected from the movement of the operation means 22 (the change in the pre-touch position). For example, when the operation means 22 moves horizontally with respect to the display 21, the predicted touch position can be predicted to lie in the moving direction. In FIG. 1A, when the operation means 22 moves from the menu 101a toward the menu 101b, the position of the menu 101b can be detected as the predicted touch position even if the pre-touch position has not yet fully reached the menu 101b. That is, the position of the menu 101 nearest in the moving direction can be set as the predicted touch position.

  In addition, when the operation means 22 moves away from the display 21 in the vertical direction, the predicted touch position detection unit 34 can refrain from detecting a predicted touch position even while the operation means 22 is still within the detection range of the touch position prediction sensor 23. This allows the lower-hierarchy menu list 200 to be removed from the display screen at an early stage.

  Further, in the present embodiment it is only necessary to detect which menu 101 the operation means 22 is about to operate. Therefore, when the operation means 22 enters the detection range of the touch position prediction sensor 23, the position of one of the menus 101a to 101d may be taken as the predicted touch position; for example, the position of the menu 101 closest to the operation means 22. In this case, since the predicted touch position is limited to the range of the menus 101, it can easily be detected by the predicted touch position detection unit 34. Note that the predicted touch position detection unit 34 acquires the position information of the menus 101 (for example, the positions of the four vertices of each rectangular menu 101) from the display screen processing unit 32 described later.
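The nearest-menu rule, optionally biased by the motion of the operation means, might be sketched as below. The menu rectangles, velocity units, and look-ahead factor are all illustrative assumptions; the patent only specifies that the nearest menu (or the nearest menu in the moving direction) becomes the predicted touch position.

```python
# Hypothetical menu rectangles (x0, y0, x1, y1), as supplied by the
# display screen processing unit 32 in the text.
MENUS = {
    "101a": (0,   0, 100, 40),
    "101b": (0,  50, 100, 90),
    "101c": (0, 100, 100, 140),
}

def predicted_menu(pos, velocity=(0.0, 0.0), lookahead=0.5):
    """Pick the menu whose centre is nearest the (optionally motion-
    projected) pre-touch position."""
    px = pos[0] + velocity[0] * lookahead   # project along the motion
    py = pos[1] + velocity[1] * lookahead
    def sq_dist(rect):
        cx, cy = (rect[0] + rect[2]) / 2, (rect[1] + rect[3]) / 2
        return (cx - px) ** 2 + (cy - py) ** 2
    return min(MENUS, key=lambda name: sq_dist(MENUS[name]))
```

With zero velocity this reduces to the plain nearest-menu rule; with a downward velocity the prediction shifts to the menu ahead of the motion, as in the FIG. 1A example of moving from menu 101a toward 101b.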

  The input/display unit 33 acquires the user's operation information from the contact position detected by the touch panel 24 and the position of the menu 101 displayed when the operation means 22 touched the touch panel 24. For example, when the contact position is detected on the menu 101b while the "memory" menu 101b of FIG. 1 is displayed, the input/display unit 33 acquires operation information corresponding to "memory". In this case, the input/display unit 33 requests the display screen processing unit 32 to display the menu list 200 in the lower hierarchy of "memory". When the operation information requests an operation of the information terminal 50 rather than a display screen transition, the input/display unit 33 requests the navigation ECU 20 to execute a function according to the operation information, for example road map display, acquisition of the current position, or a route search to a destination.

  The input/display unit 33 also generates system state information and stores it in the system state information storage unit 35. The system state information represents the current state of the information terminal 50. In the present embodiment, the information on the display screen currently shown on the display 21 is, in particular, system state information, so the system state information storage unit 35 stores a screen number identifying the current display screen. The display screens reachable from the current display screen can be traced from this screen number.

  The display screen processing unit 32 refers to the display screen information storage unit 36 and generates a display screen containing either only the menu list 100 of the current hierarchy, or the menu list 100 of the current hierarchy together with the menu list 200 of the lower hierarchy.

  The displayed information has a layer structure: the display screen processing unit 32 generates, for example, a road map layer and a menu list 100 layer, and the input/display unit 33 superimposes them to produce the display screen.

  FIG. 5A is a diagram showing an example of the screen configuration information stored in the display screen information storage unit 36. In the screen configuration information, a figure number, position information for a predetermined point such as the upper-left corner or center, and the character information contained in the menu are registered for each menu, in association with the screen number of the display screen. The figure number designates the shape of the menu: as shown in FIG. 5B, shape information (a size specifying the menu shape and color information determining its color) is registered in the display screen information storage unit 36 for each figure number. FIG. 5B shows rectangular menus, but circular or elliptical menus can be specified in the same manner.

  The character information in FIG. 5A consists of the characters and symbols displayed in the menu lists 100 and 200. Preferably it also includes the character color and size (number of points).
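One plausible in-memory layout for the FIG. 5A/5B screen configuration information is sketched below; the concrete figure numbers, positions, and sizes are assumptions for illustration, since the patent does not give actual values.

```python
# Per screen number, each menu number carries a figure number, an anchor
# position (here the upper-left corner), and its character information.
SCREEN_CONFIG = {
    1: {
        1: {"figure": 10, "pos": (0, 0),   "text": "50 sounds"},
        2: {"figure": 10, "pos": (0, 50),  "text": "memory"},
        3: {"figure": 10, "pos": (0, 100), "text": "genre"},
        4: {"figure": 10, "pos": (0, 150), "text": "phone number"},
    },
}

# Figure number -> shape information (size and color), as in FIG. 5B.
FIGURE_SHAPES = {
    10: {"size": (100, 40), "color": "gray"},
}

def menu_label(screen_no, menu_no):
    """Look up the character information of one menu on one screen."""
    return SCREEN_CONFIG[screen_no][menu_no]["text"]
```

The display screen processing unit would iterate over `SCREEN_CONFIG[screen_no]`, resolve each figure number through `FIGURE_SHAPES`, and draw the menu at its registered position.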

  In the display screen information storage unit 36, screen transition information is registered in which the screen number of the lower hierarchy is associated with each combination of screen number and menu number. FIGS. 6A to 6C show examples of screen transition information. FIG. 6A shows the screen numbers of the lower-hierarchy display screens associated with menus 1 to 4 of the display screen with screen number 1, and FIG. 6B shows those associated with menus 1 to 4 of the display screen with screen number 4.

  For example, when the operation means 22 touches “menu 3 on the display screen of screen number 1”, it can be understood that a display screen of screen number 4 may be generated as a lower-layer display screen. Similarly, when the operation means 22 touches “menu 4 on the display screen with screen number 4”, it can be seen that a display screen with screen number 6 may be generated as a lower-layer display screen.

  As shown in the screen configuration information of FIG. 5A, when menus 1 to 4 of screen number 1 correspond to the display screen of FIG. 1A and the operation means 22 touches the "genre" menu 101c (menu 3), the lower-hierarchy items "eat", "buy", "stay", and "play" are displayed.
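The FIG. 6 screen transition information amounts to a mapping from (screen number, menu number) to a lower-hierarchy screen number. The two entries below follow the examples given in the text; everything else about the sketch is illustrative.

```python
# (screen number, menu number) -> lower-hierarchy screen number.
SCREEN_TRANSITIONS = {
    (1, 3): 4,   # menu 3 on screen 1 leads to screen 4
    (4, 4): 6,   # menu 4 on screen 4 leads to screen 6
}

def next_screen(screen_no, menu_no):
    """Return the lower-hierarchy screen number, or None when no entry is
    registered (the lowest hierarchy, as for screen number Z)."""
    return SCREEN_TRANSITIONS.get((screen_no, menu_no))
```

The same lookup serves both an actual touch and a predicted touch position, which is why the next display screen determination unit can reuse the procedure.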

  The next display screen determination unit 31 determines the menu number to which the operation unit 22 has approached based on the screen number of the display screen currently displayed on the display 21 registered in the system state information and the predicted touch position. Then, the screen number of the lower-level display screen is determined by referring to the screen transition information based on the determined menu number. The procedure for determining the screen number of the lower-level display screen is the same as when the operation means 22 touches the touch panel 24.

  For example, when the predicted touch position of the operation unit 22 is “menu 3 of screen number 1”, the next display screen determination unit 31 determines the display screen of screen number 4 to be a lower-layer display screen. In this case, as shown in FIG. 1B, a menu list 200a of “eat” “buy” “stay” “play” is displayed.

  The display screen of the lowest hierarchy will now be described. As shown in FIG. 6C, no lower-hierarchy screen number is associated with the display screen of the lowest hierarchy; in FIG. 6C, screen number Z corresponds to such a display screen. For example, when the operation means 22 touches one of menus 1 to 4 of screen number Z, the navigation ECU 20 displays detailed information for the menu the user has traced. The detailed information includes, for example, the telephone number "01-2345-6789" reached by tracing from the "phone number" menu 101d, the address and telephone number of an accommodation facility, or the operation screen of the air conditioner.

  For example, when the operation means 22 touches the telephone number "01-2345-6789", the navigation ECU 20 places a call to this number and changes the display screen to a predetermined calling screen. After the call (or during it), the display screen transitions, for example, back to the original screen-number-Z display screen or to another predetermined display screen, and thereafter can transition in the same manner according to the screen transition information.

  Incidentally, it is preferable that the menu list 100, which is displayed so as to be selectable (contactable) by the operation means 22, and the lower-hierarchy menu list 200 differ in display mode, such as size, brightness, and color; this gives excellent visibility and improves operability. Therefore, when the screen number of the lower-hierarchy display screen is notified from the next display screen determination unit 31, the display screen processing unit 32 displays each lower-hierarchy menu reduced in size, with fewer character points, or at lower brightness. This prevents the user from accidentally touching the lower-hierarchy menu list 200 with the operation means 22, and prevents the display 21 from being occupied by it.
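The display-mode distinction could be expressed as a simple style derivation; the scale, point, and brightness factors below are assumed values, since the patent only says the lower-hierarchy menus are shown smaller, with fewer character points, or dimmer.

```python
# Style of the selectable upper-hierarchy menus (illustrative baseline).
UPPER_STYLE = {"scale": 1.0, "font_pt": 14, "brightness": 1.0}

def lower_style(upper=UPPER_STYLE):
    """Derive the preview style for the lower-hierarchy menu list 200."""
    return {
        "scale": upper["scale"] * 0.8,             # each menu reduced
        "font_pt": upper["font_pt"] - 4,           # fewer character points
        "brightness": upper["brightness"] * 0.6,   # dimmed
    }
```

Because the lower list is visibly "secondary", the user is less likely to mistake it for a contactable target, matching the operability rationale above.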

[Operation procedure of information terminal 50]
FIG. 7 is a flowchart showing an example of the operation procedure of the information terminal 50. The flowchart in FIG. 7 starts when, for example, the information terminal 50 is switched on. The display screen processing unit 32 reads the screen configuration information of a predetermined top-level display screen and generates a display screen (S10). The input / display unit 33 displays the display screen on the display 21. The system state information storage unit 35 stores the screen number of the current display screen.

  While the display screen is shown on the display 21, the touch position prediction sensor 23 continuously detects the pre-touch position, and the predicted touch position detection unit 34 detects the predicted touch position from it (S20).

  The navigation ECU 20 determines whether a predicted touch position has been detected (S30). If it has not (No in S30), the operation means 22 is not within the detection range, so detection of the predicted touch position continues in step S20.

  When a predicted touch position is detected (Yes in S30), the navigation ECU 20 determines the menu number about to be touched from the screen number of the current display screen registered in the system state information and the predicted touch position, and determines the screen number of the lower-hierarchy display screen by referring to the screen transition information with that menu number (S40).

  Since the screen number of the lower-level display screen may not be registered in the screen transition information, the display screen processing unit 32 determines whether there is a lower-level menu list 200 to be displayed (S50). When there is no menu list 200 to be displayed (No in S50), the display screen processing unit 32 returns to Step S20 and repeats the detection of the pre-touch position.

  When there is a menu list 200 to be displayed (Yes in S50), the display screen processing unit 32 reads the screen configuration information of the corresponding display screen from the display screen information storage unit 36 and generates the display screen of the lower-hierarchy menu list 200 (S60). The input/display unit 33 then displays the lower-hierarchy menu list 200 on the current display screen (S70). A new layer may be superimposed, or the current display screen may be updated with a display screen that includes both the menu list 100 and the lower-hierarchy menu list 200.

  After step S70 (or at any time up to step S70), the input/display unit 33 determines whether the operation means 22 has touched the touch panel 24 (S80). When it has (Yes in S80), the input/display unit 33 notifies the display screen processing unit 32 of the contact position; the display screen processing unit 32 determines, from the screen transition information, the screen number of the display screen in the lower hierarchy of the menu 101 at the contact position, and generates the display screen of that screen number (S90). The input/display unit 33 then updates the display screen (S100). The information terminal 50 repeats the above processing.
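Condensing the FIG. 7 flow (S10 to S100) into code, one pass of the loop might look like the sketch below. This is an assumed re-expression: sensor and panel inputs are passed in as plain values, and the state is reduced to the current screen number plus the previewed lower list.

```python
def step(state, predicted_menu=None, touched_menu=None, transitions=None):
    """One pass of S20..S100 over a state dict {"screen": n, "lower": m}."""
    transitions = transitions or {}
    if touched_menu is not None:                           # S80 Yes
        nxt = transitions.get((state["screen"], touched_menu))
        if nxt is not None:
            return {"screen": nxt, "lower": None}          # S90, S100
        return state                                       # lowest hierarchy
    if predicted_menu is not None:                         # S30 Yes
        lower = transitions.get((state["screen"], predicted_menu))
        return {"screen": state["screen"], "lower": lower}  # S40-S70
    return {"screen": state["screen"], "lower": None}      # S30 No

state = {"screen": 1, "lower": None}                       # S10: top screen
state = step(state, predicted_menu=3, transitions={(1, 3): 4})
# approaching menu 3 previews the lower list of screen 4
state = step(state, touched_menu=3, transitions={(1, 3): 4})
# the actual touch commits the transition to screen 4
```

Note the asymmetry the flowchart encodes: proximity only changes the `lower` preview, while a touch changes the `screen` itself.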

[Example of next screen display]
・Upper-hierarchy menu list 100
So far the lower-hierarchy menu list 200 has been displayed, but the upper-hierarchy menu list 100 can also be displayed using the predicted touch position. FIG. 8A shows an example of a display screen on which the upper-hierarchy menu list 100 is displayed. When the operation means 22 touches the "memory" menu 101b in FIG. 1C, the display screen shown in FIG. 1D is displayed. On the display screen of FIG. 1D, the upper-hierarchy menus "50 sounds", "memory", "genre", and "phone number" are not displayed; however, when the predicted touch position of the operation means 22 matches the position of the "return" menu 102, the menu list 100 in the upper hierarchy of the current display screen is displayed.

  The display of the upper-hierarchy menu list 100 can be realized by the next display screen determination unit 31 tracing the screen transition information of FIGS. 6A to 6C from the lower hierarchy to the upper hierarchy. In this case, the predicted touch position is the position of the "return" menu 102; when "return" is detected from the predicted touch position, the next display screen determination unit 31 requests the display screen processing unit 32 to generate the upper-hierarchy menu list 100.
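Tracing the screen transition information upward can be sketched by inverting the same kind of lookup table. Again, this is only an illustrative assumption: the table contents and the name `parent_screen` are invented, and the patent does not specify how the traversal is implemented.

```python
# Hypothetical sketch: trace the screen transition information (cf. FIG. 6)
# from the lower hierarchy back to the upper hierarchy for the "return" menu.

# (screen number, menu number) -> lower-hierarchy screen number. Invented data.
SCREEN_TRANSITIONS = {
    (1, "genre"): 2,
    (2, "play"): 3,
}

def parent_screen(current_screen):
    """Return the upper-hierarchy screen that transitions into current_screen,
    i.e. the screen whose menu list 100 should be shown for "return"."""
    for (screen, _menu), lower in SCREEN_TRANSITIONS.items():
        if lower == current_screen:
            return screen
    return None  # already at the top hierarchy
```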

  Note that the current display screen in FIG. 8A was reached by bringing the operation means 22 into contact with the upper-hierarchy "memory" menu 101b, so "memory" in the upper-hierarchy menu list 100 may be highlighted. In FIG. 8A, "memory" is surrounded by a double line, but it can also be emphasized by increased brightness, blinking, or the like. This makes it easier for the user to grasp which menu 101b in the upper hierarchy the operation means 22 touched.

・Menu list 300 two hierarchies down
Although FIG. 1 displays the lower-hierarchy menu list 200, a menu list 300 in the hierarchy below the lower hierarchy (hereinafter referred to as the grandchild hierarchy) can also be displayed. FIG. 8B shows an example of a display screen on which the grandchild menu list 300 is displayed. The menu list 200 in the lower hierarchy of the "genre" menu 101c consists of "eat", "buy", "stay", and "play"; the menu list 300 in the lower hierarchy of "play" is, for example, "amusement park", "aquarium", "zoo", and "professional baseball".

  Note that which of the menus 200a1 to 200a4 in the lower-hierarchy menu list 200a the user will touch with the operation means 22 is not input to the next display screen determination unit 31, so it is unclear which menu's lower-hierarchy menu list 300 should be displayed. Therefore, for example, the next display screen determination unit 31 displays the lower-hierarchy menu lists 300 of the menus 200a1 to 200a4 in order from the right or left of the lower-hierarchy menu list 200a.

  Accordingly, the user can easily grasp the three levels of menu lists 100, 200, and 300 without touching the touch panel 24 even once, and operability can be improved.

  If the display of the lower-hierarchy menu list 200 and the grandchild-hierarchy menu list 300 starts at the same time, the user may be confused. It is therefore preferable, for example, to display the grandchild menu list 300 once a predetermined time has elapsed after the lower-hierarchy menu list 200 is displayed. Displaying the grandchild menu list 300 with this time difference makes it easier for the user to grasp the relationship between the lower hierarchy and the grandchild hierarchy.
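The time-difference display described above amounts to a simple elapsed-time check before the grandchild list is drawn. The sketch below is a hypothetical illustration; the delay value and all names are invented, as the patent does not specify a concrete duration or mechanism.

```python
# Hypothetical sketch: show the grandchild menu list 300 only after a
# predetermined time has elapsed since the lower menu list 200 appeared.

GRANDCHILD_DELAY = 0.5  # seconds; illustrative value, not from the patent

def should_show_grandchild(lower_shown_at, now, delay=GRANDCHILD_DELAY):
    """True once the delay since the lower-hierarchy list 200 was shown has
    passed; False while waiting or when list 200 is not yet displayed."""
    return lower_shown_at is not None and (now - lower_shown_at) >= delay
```

In a real display loop, `now` would come from a monotonic clock polled each frame, and the grandchild list would be drawn on the first frame for which this check returns True.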

  The grandchild-hierarchy menu list 300 can be realized by the next display screen determination unit 31 tracing the screen transition information of FIGS. 6A to 6C from the upper hierarchy to the lower hierarchy; that is, it can be realized with the same configuration as that used to display the lower hierarchy.

・Display of operation content
The menu list 200 has been described above as an example, but the information terminal 50 of the present embodiment can also display the operation the information terminal 50 will perform when a menu 101 is operated. Examples of such operations include "making a call", "adjusting the air conditioner temperature", "operating the interior lighting", and "playing music".

  FIG. 9A shows an example of the operation content for "making a call", and FIG. 9B shows an example for "adjusting the air conditioner temperature". When the predicted touch position is, for example, the position of "Mr. B", the next display screen determination unit 31 displays the operation content 401 of the information terminal 50 for when the operation means 22 touches the telephone number of "Mr. B". The operation content 401 is stored in the display screen information storage unit 36 in association with, for example, a screen number and a menu number.

  For example, when the operation means 22 touches the telephone number, the information terminal 50 calls "Mr. B"; therefore, the operation content 401 "Make a call" is displayed in FIG. 9A.

  Further, as shown in FIG. 9B, when the predicted touch position is, for example, the position of the "△" icon for raising the set temperature, the next display screen determination unit 31 displays the operation content 402 of the information terminal 50 for when the operation means 22 touches "△". For example, when the operation means 22 touches "△" on the air conditioner setting screen, the information terminal 50 raises the set temperature, so the operation content 402 "set temperature 23°C → 23°C + 0.5°C" is displayed in FIG. 9B.
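The association described here, operation content stored against a screen number and a menu number, can be sketched as a plain keyed lookup. The keys and strings below are invented examples; the patent only states that the operation content 401/402 is stored in the display screen information storage unit 36 with a screen number and menu number.

```python
# Hypothetical sketch: operation content (cf. FIGS. 9A/9B) keyed by
# (screen number, menu number) in the display screen information storage unit.

OPERATION_CONTENT = {
    ("phone_list", "Mr. B"): "Make a call",
    ("aircon_setting", "temp_up"): "Set temperature 23 C -> 23.5 C",
}

def operation_content(screen, menu):
    """Return the operation content to show for a predicted touch on `menu`
    of `screen`, or None when no content is registered for that pair."""
    return OPERATION_CONTENT.get((screen, menu))
```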

  While following the lower hierarchies, the user may mistakenly believe that there is a further menu below a lowest-hierarchy menu. In that case, since the information terminal 50 operates in response to the menu operation, an additional operation to cancel that operation becomes necessary. With the information terminal 50 of the present embodiment, however, the operation content is displayed merely by bringing the operation means 22 close to the menu, so such erroneous inputs and the operations needed to cancel them can be reduced.

・Operation content displayed when no menu is shown
So far, the menu list 200 or the operation content has been displayed by bringing the operation means 22 close to a menu that is visibly displayed, but the menu need not be visible. FIG. 10A shows an example of a road map display screen displayed together with selectable menus 500a to 500c (hereinafter referred to as menu 500 when not distinguished). In FIG. 10A, the menus 500 that can be operated on the road map display screen are always displayed.

  When the information terminal 50 of this embodiment is applied to the road map display screen of FIG. 10A as described above, lower-hierarchy menus and operation content can be displayed by bringing the operation means 22 close to a menu 500. In FIG. 10A, the operation content 403 "Register current location" is displayed for the "Point registration" menu 500b.

  On the other hand, as shown in FIG. 10B, the operation content 403 can be displayed even when the menu 500 is not displayed. When the operation means 22 approaches the same position as in FIG. 10A, the operation content 403 "Register current location" of "Point registration" is displayed. The menu 500b may also be displayed together with the operation content 403.

  Accordingly, since the menu 500 can be hidden, the display area of the road map can be widened; and because the operation content is displayed when the user brings the operation means 22 close to the display 21, the user is also prevented from losing track of how to operate the terminal. Although FIG. 10 takes operation content as an example, the same applies to lower-hierarchy menus.

  Whether the menu 500 is displayed can be set by the user, so a user who has fully mastered the information terminal 50 can hide the menus, while a user who has just started using it can display them. Alternatively, for example, the number of times the user operates the menu 500 may be counted for each display screen, and the menus may be hidden without any user setting once the operation count becomes sufficiently large.
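The per-screen operation count with automatic hiding can be sketched as below. The threshold value, class name, and precedence of the explicit user setting over the count are all illustrative assumptions; the patent states only that menus may be hidden when the operation count is sufficiently large.

```python
# Hypothetical sketch: count menu 500 operations per display screen and hide
# the menu automatically once the user is familiar with that screen.

HIDE_THRESHOLD = 20  # illustrative value, not from the patent

class MenuVisibility:
    def __init__(self, user_hides_menu=None):
        self.counts = {}                        # screen number -> operation count
        self.user_hides_menu = user_hides_menu  # explicit user setting, or None

    def record_operation(self, screen):
        """Called each time the user operates the menu 500 on `screen`."""
        self.counts[screen] = self.counts.get(screen, 0) + 1

    def menu_hidden(self, screen):
        # Assumed precedence: an explicit user setting wins; otherwise hide
        # the menu once the operation count for this screen is large enough.
        if self.user_hides_menu is not None:
            return self.user_hides_menu
        return self.counts.get(screen, 0) >= HIDE_THRESHOLD
```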

  As described above, the information terminal 50 of the present embodiment can display the lower- or upper-hierarchy menu lists 100, 200, and 300 merely by the operation means 22 approaching the display 21, so the operation of bringing the operation means 22 into contact with a menu and the "return" operation become unnecessary, and operability can be greatly improved. In addition, since only the menu list of the menu to which the operation means 22 is close is displayed, there is no risk of presenting unnecessary information to the user.

FIG. 1 is a diagram showing an example of a display screen displayed on the display of the information terminal.
FIG. 2 is an example of a hardware block diagram of the information terminal.
FIG. 3 is a diagram schematically illustrating the outline of the touch position prediction sensor.
FIG. 4 is an example of a functional block diagram of the information terminal.
FIG. 5 is a diagram showing an example of the screen configuration information stored in the display screen information storage unit.
FIG. 6 is a diagram showing an example of the screen transition information.
FIG. 7 is a flowchart showing an example of the operation procedure of the information terminal.
FIG. 8 is a diagram showing an example of a display screen on which the upper-hierarchy menu list is displayed.
FIG. 9 is a diagram showing a display example of the operation content of the information terminal.
FIG. 10 is a diagram showing an example of a road map display screen displayed together with menus.
FIG. 11 is a diagram showing an example of a display screen displayed by a conventional navigation apparatus.

Explanation of symbols

21 Display
22 Operation means
23 Touch position prediction sensor
24 Touch panel
31 Next display screen determination unit
32 Display screen processing unit
33 Input/display unit
34 Predicted touch position detection unit
35 System state information storage unit
36 Display screen information storage unit
50 Information terminal

Claims (7)

  1. An information terminal that, when an operation means contacts a contact position detection means for detecting a contact position, displays lower-hierarchy options of the upper-hierarchy option corresponding to the contact position displayed on a display device, the information terminal comprising:
    a sensor for detecting that the operation means is close in the space in front of the contact position detection means; and
    display control means for displaying, when the sensor detects that the operation means is close to an upper-hierarchy option, lower-hierarchy options of that option on the display device.
  2. The information terminal according to claim 1, wherein, when the sensor detects that the operation means is close to a predetermined option, the display control means displays upper-hierarchy options on the display device.
  3. The information terminal according to claim 1, wherein, when the sensor detects that the operation means is close to an option of the lowest hierarchy, the display control means displays on the display device, instead of a lower hierarchy, the operation content for when that option is touched.
  4. The information terminal according to claim 1 or 3, wherein the options on the display device are hidden.
  5. The information terminal according to claim 1, wherein the information terminal is mounted on a vehicle, or is at least mountable on a vehicle.
  6. A navigation apparatus comprising: the information terminal according to any one of claims 1 to 5; and position detection means for detecting a current position of a vehicle, wherein a road map is displayed on the display device together with the current position of the vehicle.
  7. An option display method for displaying, when an operation means contacts a contact position detection means for detecting a contact position, lower-hierarchy options of the upper-hierarchy option corresponding to the contact position displayed on a display device, the method comprising:
    a step of a sensor detecting that the operation means approaches a space in front of the contact position detection means; and
    a step of display control means displaying, when the sensor detects that the operation means is close to an upper-hierarchy option, lower-hierarchy options of that option on the display device.
JP2008179421A 2008-07-09 2008-07-09 Information terminal, navigation apparatus, and option display method Pending JP2010019643A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008179421A JP2010019643A (en) 2008-07-09 2008-07-09 Information terminal, navigation apparatus, and option display method


Publications (1)

Publication Number Publication Date
JP2010019643A true JP2010019643A (en) 2010-01-28

Family

ID=41704710

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008179421A Pending JP2010019643A (en) 2008-07-09 2008-07-09 Information terminal, navigation apparatus, and option display method

Country Status (1)

Country Link
JP (1) JP2010019643A (en)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011134272A (en) * 2009-12-25 2011-07-07 Sony Corp Information processor, information processing method, and program
JP2012008954A (en) * 2010-06-28 2012-01-12 Brother Ind Ltd Input device, compound machine and input control program
JP2012138012A (en) * 2010-12-27 2012-07-19 Sony Corp Display control device and method
WO2013063372A1 (en) * 2011-10-26 2013-05-02 Google Inc. Detecting object moving toward or away from a computing device
JP2013520727A (en) * 2010-02-19 2013-06-06 マイクロソフト コーポレーション Off-screen gestures for creating on-screen input
CN103593045A (en) * 2012-08-14 2014-02-19 富士施乐株式会社 Display control device, image forming apparatus, and display control method
JP2014092988A (en) * 2012-11-05 2014-05-19 Ntt Docomo Inc Terminal device, screen display method, hover position correction method, and program
JP2014232475A (en) * 2013-05-30 2014-12-11 京セラドキュメントソリューションズ株式会社 Display device, electronic apparatus, and image forming apparatus
JP2015028807A (en) * 2014-10-01 2015-02-12 株式会社リコー Operation display device and method
JP2016048557A (en) * 2015-09-18 2016-04-07 古野電気株式会社 Vessel information display device, vessel information display method and vessel information display program
US9310994B2 (en) 2010-02-19 2016-04-12 Microsoft Technology Licensing, Llc Use of bezel as an input mechanism
JP2016099655A (en) * 2014-11-18 2016-05-30 コニカミノルタ株式会社 Image forming apparatus, control method of the same, and control program of the same
US9367205B2 (en) 2010-02-19 2016-06-14 Microsoft Technolgoy Licensing, Llc Radial menus with bezel gestures
US9411504B2 (en) 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Copy and staple gestures
US9411498B2 (en) 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Brush, carbon-copy, and fill gestures
US9454304B2 (en) 2010-02-25 2016-09-27 Microsoft Technology Licensing, Llc Multi-screen dual tap gesture
US9519356B2 (en) 2010-02-04 2016-12-13 Microsoft Technology Licensing, Llc Link gestures
JP2017500673A (en) * 2013-10-01 2017-01-05 クアンタム インターフェイス エルエルシー Selective attraction interface, systems and devices having such an interface, and methods of making and using the same
JP2017037680A (en) * 2016-11-21 2017-02-16 富士ゼロックス株式会社 Display control device and program
US9671930B2 (en) 2011-04-27 2017-06-06 Furuno Electric Co., Ltd. Cross-application information display device, information display method for harmonizing display positions of menu items
JP2017142806A (en) * 2017-02-24 2017-08-17 富士ゼロックス株式会社 Display control device and program
US9946383B2 (en) 2014-03-14 2018-04-17 Microsoft Technology Licensing, Llc Conductive trace routing for display and bezel sensors
US9946370B2 (en) 2005-12-30 2018-04-17 Microsoft Technology Licensing, Llc Unintentional touch rejection
US9965165B2 (en) 2010-02-19 2018-05-08 Microsoft Technology Licensing, Llc Multi-finger gestures
US10503359B2 (en) 2012-11-15 2019-12-10 Quantum Interface, Llc Selection attractive interfaces, systems and apparatuses including such interfaces, methods for making and using same

