US20050049782A1 - Navigation apparatus - Google Patents

Navigation apparatus

Info

Publication number
US20050049782A1
Authority
US
United States
Prior art keywords
display
real image
area
screen
step
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US10/434,348
Other versions
US6868336B1
Inventor
Masahiko Nakano
Current Assignee
Apple Inc
Original Assignee
Denso Ten Ltd
Priority date
Filing date
Publication date
Priority to JP2002-137430 (patent JP2003329467A)
Application filed by Denso Ten Ltd
Assigned to FUJITSU TEN LIMITED. Assignors: NAKANO, MASAHIKO
Publication of US20050049782A1
Publication of US6868336B1
Application granted
Assigned to APPLE INC. Assignors: FUJITSU TEN LIMITED
Application status: Active


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in preceding groups
    • G01C21/26 - Navigation; Navigational instruments not provided for in preceding groups, specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements of navigation systems
    • G01C21/3626 - Details of the output of route guidance instructions
    • G01C21/3647 - Guidance involving output of stored or live camera images or video streams

Abstract

A map screen is displayed in a left half area of a display panel, and a real image, such as a satellite photograph or an aerial photograph depicting the surface of the earth, is displayed in the remaining right half area based on real image data. Real images on different scales (the image as the wide area display and the image as the detailed display) can be displayed in the right half area of the display panel. A default condition of the two-screen display is set to display the real image as the wide area display in the right half area.

Description

  • The present disclosure relates to the subject matter contained in Japanese Patent Application No. 2002-137430 filed on May 13, 2002, which is incorporated herein by reference in its entirety.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The invention relates to a navigation apparatus and, more particularly, to a navigation apparatus employing real image data corresponding to images such as satellite photographs and aerial photographs depicting the surface of the earth.
  • 2. Description of the Related Art
  • In the related art, a navigation apparatus displays a map on the screen of a display device based on road map data recorded on a DVD-ROM or the like, displays the vehicle's own location on the map based on location data, and guides a user along routes to a destination.
  • However, in the navigation apparatus in the related art, the map screen to be displayed is formed from map data. There is therefore a problem that it is difficult to understand one's own current location and to grasp the actual situation around it through the map screen.
  • This is because it is difficult to display on the map screen the vertical positional relationship of, for example, roads at a multi-level crossing, and because a large number of roads, buildings, and the like that actually exist are not displayed on the map screen.
  • As one solution for overcoming this problem, JP-A-Hei. 5-113343 discloses displaying one's own current location on an aerial photograph screen that is prepared from aerial photograph data. If such an aerial photograph screen is employed, a building or the like serving as a landmark can be recognized very easily. Thus, there are advantages that one's own current location can be easily understood and the actual situation around it can be easily grasped.
  • However, the aerial photograph screen (prepared by using the aerial photograph data) does not in all cases make it easier to understand one's own current location and to grasp the surrounding situation than the map screen (prepared by using the map data). Rather, in some cases the current location can be understood more easily using the map screen than the aerial photograph screen.
  • Also, the aerial photograph data employed to prepare a real image such as the aerial photograph screen require a large quantity of data. Thus, if the aerial photograph data are stored in a memory device, a compression process must be applied. Therefore, to display the real image, the original aerial photograph data must be obtained by applying a restoration (decompression, expansion) process to the aerial photograph data that have been subjected to the compression process.
  • The restoration process is not carried out in an instant; for example, it can take several seconds to display one sheet of the aerial photograph screen on the screen of the display device. Therefore, there is a problem that, unlike the map screen in the related art, the screen cannot be continuously updated so as to respond to the moving of one's own location.
  • In this manner, the aerial photograph screen is not superior to the map screen in all respects. As a result, it is expected that not a navigation apparatus utilizing only the aerial photograph screen (nor, of course, one utilizing only the conventional map screen) but a navigation apparatus utilizing both the aerial photograph screen and the map screen will become the mainstream in the next generation.
  • Accordingly, in the future, it becomes important to use the real image such as the aerial photograph screen and the map screen properly, in view of the two different sets of features described above.
  • SUMMARY OF THE INVENTION
  • The invention has been made in view of the above problems, and it is an object of the invention to provide a navigation apparatus that can enhance the degree of the user's satisfaction by properly using both the display of a real image, such as an aerial photograph screen, and a map screen prepared by using map data.
  • In order to achieve the above object, according to a first aspect of the invention, a navigation apparatus includes a display unit and a display control unit. First and second areas are defined on the display unit. The display control unit includes a first display control section and controls the display unit to display at least one of a map screen and a real image, based on map data, real image data, and a present location. When a first instruction signal is input to the display control unit, the first display control section controls the display unit to display the map screen in the first area and the real image on one of first to n-th scales in the second area, where n is an integer not smaller than two.
  • In the navigation apparatus, the map screen is displayed in the first area on the display unit, while the real image is displayed in the second area. For example, as shown in FIG. 8, the map screen can be displayed in the left half of the display screen, while the real image can be displayed in the remaining right half. Therefore, not only the map screen but also a real image such as a satellite photograph can be displayed on the display screen. As a result, sufficient information can be offered to the user.
  • Also, the real image can be displayed in the second area on a plurality of scales different from each other (e.g., the image as the wide area display and the image as the detailed display). In addition, when a first instruction signal is input to the display control unit (for example, when the user inputs a “two screen display” instruction to the navigation apparatus), the first display control section controls the display unit to display the map screen in the first area and the real image on one of the first to n-th scales (for example, the real image as the wide area display) in the second area.
  • Therefore, for example, when the display is switched from a state in which the map screen is displayed on the overall display screen (FIG. 6) to a state in which the map screen is displayed in the left half of the display screen and the real image is displayed in the remaining right half, the real image on the one of the scales is displayed in that right half (FIG. 8).
  • Also, as described above, since real images on different scales can be displayed in the second area, the display control unit would essentially have to receive from the user an instruction specifying which scale should be selected to display the real image. However, in the navigation apparatus, since a default display condition is set in the first display control section, the user is not requested to issue the instruction every time. Therefore, the navigation apparatus becomes very convenient for the user to use.
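The default-scale behavior described above can be sketched as follows. This is a minimal illustration, not code from the patent; the names `on_two_screen_instruction`, `WIDE_AREA`, and `DETAILED` are hypothetical, and the scale strings anticipate the 2.8 m and 1.4 m per-pixel-dot figures given later in the embodiment.

```python
# Hypothetical sketch: when the "two screen" instruction signal arrives, the
# first display control section falls back to a preset default scale (the
# wide-area display) instead of asking the user which of the n scales to use.
WIDE_AREA = "2.8 m/pixel"   # small scale, wide-area view (the default)
DETAILED = "1.4 m/pixel"    # large scale, detailed view

def on_two_screen_instruction(requested_scale=None):
    """Return (first_area, second_area) contents for the two-screen display."""
    # The user is not required to choose a scale every time; the default
    # condition is used whenever no explicit request is made.
    scale = requested_scale if requested_scale is not None else WIDE_AREA
    return ("map screen", f"real image @ {scale}")
```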
  • According to a second aspect of the invention, in the first aspect, the first area and the second area do not overlap each other.
  • According to a third aspect of the invention, in the first aspect, the one of the first to n-th scales is smaller than the other scales.
  • By the way, as explained above, real image data such as the satellite photograph and the aerial photograph require a large quantity of data. Thus, if the real image data are stored in a memory device, a compression process must be applied. Therefore, to display the real image, the original real image data must be obtained by applying a restoration (decompression, expansion) process to the real image data that have been subjected to the compression process.
  • Also, the restoration process is not carried out in an instant; for example, it can take several seconds to display one sheet of the aerial photograph screen on the screen of the display device. Therefore, unlike the map screen in the prior art, the screen cannot be continuously updated so as to correlate with the moving of one's own location.
  • Therefore, as shown in FIG. 8, in the case that the map screen is displayed in the left half of the display screen and the real image is displayed in the remaining right half, the map screen can be continuously updated so as to correlate with the moving of one's own location, but the screen of the real image cannot. That is, the real image is fixed.
  • Accordingly, if the real image on the large scale (i.e., the image as the detailed display) were displayed in a narrowed area such as the second area (e.g., the right half) of the display screen, the amount of information that could be provided to the user would become extremely small.
  • In contrast, according to the navigation apparatus, the one of the scales is relatively small, and the real image as the wide area display (not the real image as the detailed display) is displayed in the second area of the display screen. Therefore, even if the real image is displayed in a narrowed area such as the second area of the display screen, a large quantity of information can be provided to the user.
  • According to a fourth aspect of the invention, in any one of the first to third aspects, when a second instruction signal is input to the display control unit, the first display control section changes the scale of the real image displayed in the second area.
  • In the fourth aspect, when the user issues the instruction for switching the scale of the real image displayed in the second area, the real image on a scale different from the scale applied before switching is displayed in the second area. Therefore, the user can switch the scale of the real image as desired.
  • According to a fifth aspect of the invention, in any one of the first to fourth aspects, a third area is defined on the display unit. The display control unit further includes a second display control section. When a third instruction signal is input to the display control unit, the second display control section controls the display unit to display one of the map screen and the real image in the third area. When the display unit displays the one in the third area and a fourth instruction signal is input to the display control unit, the second display control section changes the one of the map screen and the real image to the other.
  • In the fifth aspect, the image displayed in the third area on the display screen can be switched from one of the map screen and the real image to the other. For example, when the map screen is displayed on the overall screen of the display screen (FIG. 6), the overall screen on the display screen can be switched from the map screen to the real image (FIG. 7).
  • According to a sixth aspect of the invention, in the fifth aspect, the third area contains the first and second areas.
  • According to a seventh aspect of the invention, in the fifth or sixth aspect, when the third instruction signal is input to the display control unit, the second display control section controls the display unit to display the real image on another of the first to n-th scales, which is different from the one of the first to n-th scales, in the third area.
  • Two or more types of real images on different scales (e.g., the image as the wide area display and the image as the detailed display) can be displayed in the third area. Also, a default display condition in the second display control section is set to display, in the third area, the real image on a second scale that is different from the one of the scales (e.g., the image as the detailed display).
  • Therefore, if the display is switched from the state in which the map screen is displayed on the overall display screen (FIG. 6) to the state in which the real image is displayed on the overall display screen, the real image on the second scale is displayed on the overall display screen (FIG. 7).
  • Also, as described above, since two or more types of real images on different scales can be displayed in the third area, the display control unit would essentially have to receive from the user the instruction specifying which scale should be selected to display the real image. However, in the navigation apparatus, since the default display condition in the second display control section is set, the user is not requested to issue the instruction every time. Therefore, the system becomes very convenient for the user to use. Further, the scale of the real image is different from that used when the real image is displayed in the second area of the display screen. Therefore, more significant information can be provided to the user.
  • According to an eighth aspect of the invention, in any one of the fifth to seventh aspects, when the display unit displays the real image in the third area and a fifth instruction signal is input to the display control unit, the second display control section changes the scale of the displayed real image.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram showing schematically a pertinent portion of a navigation apparatus according to an embodiment (1) of the invention.
  • FIG. 2 is a flowchart showing processing operations executed by a microcomputer in the navigation apparatus according to the embodiment (1) of the invention.
  • FIG. 3 is a flowchart showing processing operations executed by the microcomputer in the navigation apparatus according to the embodiment (1) of the invention.
  • FIG. 4 is a flowchart showing processing operations executed by the microcomputer in the navigation apparatus according to the embodiment (1) of the invention.
  • FIG. 5 is a view showing an example of a screen state displayed on a display panel of the navigation apparatus according to the embodiment (1) of the invention.
  • FIG. 6 is a view showing an example of a screen state displayed on the display panel of the navigation apparatus according to the embodiment (1) of the invention.
  • FIG. 7 is a view showing an example of a screen state displayed on the display panel of the navigation apparatus according to the embodiment (1) of the invention.
  • FIG. 8 is a view showing an example of a screen state displayed on the display panel of the navigation apparatus according to the embodiment (1) of the invention.
  • FIG. 9 is a view showing an example of a screen state displayed on the display panel of the navigation apparatus according to the embodiment (1) of the invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • An embodiment of a navigation apparatus according to the invention will be explained below with reference to the drawings. FIG. 1 is a block diagram schematically showing a pertinent portion of a navigation apparatus according to an embodiment (1).
  • A speed sensor 2 for acquiring covered-distance information by executing a calculation based on the vehicle speed, and a gyro sensor 3 for acquiring running-direction information, are connected to a microcomputer 1. The microcomputer 1 can determine the location of its own vehicle based on the covered-distance information and the running-direction information, both of which are calculated (self-contained navigation).
  • A GPS receiver 4, which receives a GPS signal transmitted from a satellite via an antenna 5, is also connected to the microcomputer 1. The microcomputer 1 can determine the location of its own vehicle based on the GPS signal (GPS navigation).
  • Also, a DVD drive 6 that can acquire map data and real image data from a DVD-ROM 7 (other storage devices are also available) is connected to the microcomputer 1. On the DVD-ROM 7, map data covering substantially the whole country are stored, and real image data of aerial photographs depicting the surface of the earth are recorded for limited areas (e.g., major cities). The microcomputer 1 reads necessary map data and real image data from the DVD-ROM 7 based on the determined location of its own vehicle, and then stores these data in a RAM 1 a of the microcomputer 1.
  • Also, real image data on a scale of 1.4 m per pixel dot and real image data on a scale of 2.8 m per pixel dot are stored in the DVD-ROM 7 as the real image data of the satellite photograph that depicts the surface of the earth in the same area. When the real image is displayed as a wide area display, the microcomputer 1 employs the real image data on the scale of 2.8 m per pixel dot, which is the smaller scale. When the real image is displayed as a detailed display, the microcomputer 1 employs the real image data on the scale of 1.4 m per pixel dot, which is the larger scale.
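The two stored scales can be put in concrete terms with a short sketch. The 400-pixel area width and the function `ground_width_m` are assumptions for illustration, not values from the patent; only the two metres-per-pixel-dot figures come from the text above.

```python
# Hypothetical illustration: how much ground a display area of a given pixel
# width covers at each of the two stored scales.  The wide-area data set
# (2.8 m per pixel dot) shows twice the terrain of the detailed one.
SCALES_M_PER_PIXEL = {"wide": 2.8, "detail": 1.4}

def ground_width_m(mode, area_width_pixels=400):
    """Width of terrain, in metres, shown in an area of the given pixel width."""
    return SCALES_M_PER_PIXEL[mode] * area_width_pixels
```

At an assumed half-screen width of 400 pixels, the wide-area image spans roughly 1.1 km of terrain while the detailed image spans about 560 m, which is why the wide-area image is the more informative default for the narrow right-half area.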
  • Also, the microcomputer 1 can display the map screen, on which the current location of its own vehicle is displayed precisely, on a display panel 9 b by combining the determined current location of the own vehicle with the map data (executing the map matching process). Also, a switch signal output from a joystick 8 a or a button switch 8 b (e.g., a “two screen” button switch 8 b 1) provided on a remote controller 8 and a switch signal output from a button switch 9 a provided on a display device 9 are input into the microcomputer 1. Processes are then executed in the microcomputer 1 in response to these switch signals.
  • Also, a plurality of infrared LEDs and a plurality of phototransistors are arranged on the upper/lower and left/right sides of the display panel 9 b so as to oppose each other, such that the position on the display panel 9 b touched by the user's finger can be detected. The microcomputer 1 can acquire the detected result.
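A minimal sketch of this kind of crossed-beam touch detection, assuming one boolean per LED/phototransistor pair (the function name and data representation are hypothetical, not from the patent):

```python
# Hypothetical sketch: infrared LEDs on two sides of the panel shine across to
# phototransistors on the opposite sides.  A finger blocks one horizontal and
# one vertical beam; the indices of the blocked beams give the touch position.
def locate_touch(row_beams, col_beams):
    """row_beams/col_beams are lists of booleans (True = beam blocked).
    Returns (row, col) of the touch, or None when no crossing is blocked."""
    try:
        row = row_beams.index(True)
        col = col_beams.index(True)
    except ValueError:
        return None  # no beam blocked on one of the axes: no touch detected
    return (row, col)
```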
  • Next, processing operations (1) executed by the microcomputer 1 in the navigation apparatus according to the embodiment (1) will be explained with reference to flowcharts shown in FIG. 2 to FIG. 4 hereinafter. First, it is decided whether or not a flag f1 indicating the state of a screen that is displayed on the display panel 9 b is 0 (step S1).
  • If it is decided that the flag f1 is 0, then the current location of own vehicle is calculated from the GPS signal and the like (step S2). Then, the map screen showing the neighborhood of the current location of own vehicle is displayed on the display panel 9 b based on the calculated current location of own vehicle and the map data stored in the RAM 1 a (step S3). FIG. 5 is a view showing a state in which the map screen is displayed on the display panel 9 b.
  • Then, based on the current location information of own vehicle and the real image data stored in the RAM 1 a, it is decided whether or not the real image data of the neighborhood of the current location of own vehicle are present (step S4). If it is decided that the real image data are present, a “satellite photograph” button (touch switch) for allowing the user to instruct switching the display from the map screen to the satellite photograph (real image) is formed (step S5). Then, a flag f2 indicating that the “satellite photograph” button is formed is set to 1 (step S6). FIG. 6 is a view showing a state in which the “satellite photograph” button switch is formed on the display panel 9 b.
  • Then, it is decided whether or not the user touches on the “satellite photograph” button switch (step S7). If it is decided that the user touches on the “satellite photograph” button switch, the “satellite photograph” button switch is removed (step S8). Then, the flag f2 is set to 0 (step S9). Then, the real image indicating the neighborhood of the current location of own vehicle (as the detailed display) is displayed on the display panel 9 b based on the calculated current location information of own vehicle and the real image data (on the scale of 1.4 m per pixel dot) stored in the RAM 1 a (step S10). Then, a “map” button switch (touch switch) for allowing the user to instruct switching the display from the real image to the map screen is formed (step S11). Then, the flag f1 is set to 1 (step S12). FIG. 7 is a view showing a state in which the real image is displayed on the display panel 9 b.
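The flow of steps S1 through S12 can be sketched as a small flag-driven state machine. This is an illustrative reconstruction, not the patent's implementation; the class and attribute names are hypothetical, and only the flag values and step behavior follow the description.

```python
# Hypothetical sketch of steps S1-S12: flag f1 tracks the screen state
# (0 = full-screen map, 1 = full-screen real image) and flag f2 tracks whether
# the "satellite photograph" touch button is currently formed on the panel.
class DisplayState:
    def __init__(self):
        self.f1 = 0           # screen state flag
        self.f2 = 0           # "satellite photograph" button formed?
        self.screen = "map"

    def update(self, real_image_available, touched_satellite_button):
        if self.f1 == 0:                       # S1: map screen is shown
            if real_image_available:           # S4: data for this area exist
                self.f2 = 1                    # form the button (S5, S6)
                if touched_satellite_button:   # S7: user touched the button
                    self.f2 = 0                # remove the button (S8, S9)
                    self.screen = "real image (detail, 1.4 m/pixel)"  # S10
                    self.f1 = 1                # S12: now full-screen real image
```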
  • In contrast, in step S7, if it is decided that the user does not touch on the “satellite photograph” button switch, it is decided whether or not the user pushes down a “two screen” button switch 8 b 1 provided on the remote controller 8 (step S13). If it is decided that the user pushes down the “two screen” button switch 8 b 1, it is decided whether or not the flag f2 indicating that the “satellite photograph” button switch is formed is 1 (step S14).
  • If it is decided that the flag f2 is 1 (i.e., the “satellite photograph” button switch is formed), the “satellite photograph” button switch is removed (step S15). Then, the flag f2 is set to 0 (step S16). Then, the process goes to step S17. In contrast, if it is decided that the flag f2 is not 1 (i.e., the “satellite photograph” button switch is not formed), there is no necessity that the “satellite photograph” button switch should be removed. Thus, the process goes to step S17 as it is.
  • In step S17, based on the current location information of own vehicle and the real image data stored in the RAM 1 a, it is decided whether or not the real image data of the neighborhood of the current location of own vehicle are present. Then, if it is decided that the real image data are present, a map screen showing the neighborhood of the current location of own vehicle is displayed in a left half area of the display panel 9 b based on the current location information of own vehicle and the map data stored in the RAM 1 a (step S18). In addition, the real image showing the neighborhood of the current location of own vehicle (as the wide area display) is displayed in a right half area of the display panel 9 b based on the current location information of own vehicle and the real image data (on the scale of 2.8 m per pixel dot) stored in the RAM 1 a (step S19). The flag f1 is set to 2 (step S20).
  • Then, a “detail” button switch (touch switch) for allowing the user to instruct switching the display from the real image as the wide area display to the real image as the detailed display is formed (step S21). Then, a flag f3 indicating that the “detail” button switch is formed is set to 1 (step S22). FIG. 8 is a view showing a state in which the map screen is displayed in the left half of the display panel 9 b and the real image is displayed in the remaining right half.
  • In contrast, in step S17, if it is decided that the real image data are not present, the map screen showing the neighborhood of the current location of own vehicle is displayed in the left half area of the display panel 9 b based on the current location information of own vehicle and the map data stored in the RAM 1 a (step S23). In addition, the map screen (as the wide area display) is displayed in the right half area of the display panel 9 b (step S24). Then, the flag f1 is set to 3 (step S25).
  • Then, the “detail” button switch (touch switch) for allowing the user to instruct switching the display from the map screen as the wide area display to the map screen as the detailed display is formed (step S26). Then, the flag f3 indicating that the “detail” button switch is formed is set to 1 (step S27). FIG. 9 is a view showing a state in which the map screen is displayed in the left half area of the display panel 9 b and the map screen as the wide area display is displayed in the remaining right half.
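The two-screen branch of steps S17 through S27 can be summarized in a sketch like the following. The function name and returned tuple are hypothetical conveniences for illustrating the resulting screen state and flag values; the flag values and screen contents themselves follow the description.

```python
# Hypothetical sketch of the two-screen branch (steps S17-S27): the left half
# always shows the map; the right half shows the wide-area real image when
# real image data exist for the neighborhood, otherwise a wide-area map.
def enter_two_screen(real_image_available):
    """Return (left, right, f1, f3) after the 'two screen' instruction."""
    left = "map (detail)"
    if real_image_available:
        right, f1 = "real image (wide area, 2.8 m/pixel)", 2   # S18-S20
    else:
        right, f1 = "map (wide area)", 3                       # S23-S25
    f3 = 1  # the "detail" button switch is formed in either case (S21/S26)
    return (left, right, f1, f3)
```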
  • By the way, in step S4, if it is decided that the real image data are not present, it is decided whether or not the flag f2 indicating that the “satellite photograph” button switch is formed is 1 (step S28). Then, if it is decided that the flag f2 is 1 (i.e., the “satellite photograph” button switch is formed), the “satellite photograph” button switch is removed (step S29). The flag f2 is set to 0 (step S30). Then, the process goes to step S13. In contrast, if it is decided that the flag f2 is not 1 (i.e., the “satellite photograph” button switch is not formed), there is no need that the “satellite photograph” button switch should be removed. Thus, the process goes to step S13 as it is.
  • Also, in step S1, if it is decided that the flag f1 showing the state of the screen displayed on the display panel 9 b is not 0, then it is decided whether or not the flag f1 is 1 (step S31). Then, if it is decided that the flag f1 is 1 (i.e., as shown in FIG. 7, the real image is displayed in the overall area of the display panel 9 b), it is decided whether or not the user touches on the “map” button switch (step S32).
  • If it is decided that the user touches on the “map” button switch, the flag f1 is set to 0 (step S33). Then, the process goes to step S2. In contrast, if it is decided that the user does not touch on the “map” button switch, the processing operations (1) are ended as they are. Also, in step S31, if it is decided that the flag f1 is not 1, the process goes to step S41 (FIG. 3).
  • In step S41, it is decided whether or not the flag f1 indicating the state of the screen that is displayed on the display panel 9 b is 2. Then, if it is concluded that the flag f1 is 2 (i.e., as shown in FIG. 8, the map screen is displayed in the left half area of the display panel 9 b and the real image is displayed in the remaining right half area), the current location information of own vehicle is calculated based on the GPS signal and the like (step S42). Then, the map screen showing the neighborhood of the current location of own vehicle is displayed in the left half area of the display panel 9 b based on the calculated current location information of own vehicle and the map data stored in the RAM 1 a (step S43). That is, the map screen is updated in response to the moving of the vehicle.
  • Then, it is decided whether or not the flag f3 indicating that the “detail” button switch is formed is 1 (step S44). Then, if it is concluded that the flag f3 is 1 (i.e., the “detail” button switch is formed), it is decided whether or not the user touches on the “detail” button switch (step S45). Then, if it is concluded that the user touches the “detail” button switch, it is regarded that the user requests the real image as the detailed display. Thus, the real image showing the neighborhood of the current location of own vehicle (as the detailed display) is displayed in the right half area of the display panel 9 b based on the current location information of own vehicle and the real image data (on the scale of 1.4 m per pixel dot) stored in the RAM 1 a (step S46).
  • Then, the “detail” button switch is removed (step S47). The “wide area” button switch is formed instead (step S48). The flag f3 is set to 0 (step S49). Then, the process goes to step S55. In contrast, in step S45, if it is concluded that the user does not touch on the “detail” button switch, steps S46 to S49 are skipped and then the process goes to step S55 as it is.
  • Also, in step S44, if it is concluded that the flag f3 is not 1 (i.e., the “wide area” button switch is formed), it is decided whether or not the user touches the “wide area” button switch (step S50). If it is concluded that the user touches on the “wide area” button switch, it is regarded that the user requests the real image as the wide area display. Thus, the real image showing the neighborhood of the current location of own vehicle (as the wide area display) is displayed in the right half area of the display panel 9 b based on the current location information of own vehicle and the real image data (on the scale of 2.8 m per pixel dot) stored in the RAM 1 a (step S51).
  • Then, the “wide area” button switch is removed (step S52). The “detail” button switch is formed in place of the “wide area” button switch (step S53). The flag f3 is set to 1 (step S54). The process goes to step S55. In contrast, in step S50, if it is concluded that the user does not touch on the “wide area” button switch, steps S51 to S54 are skipped and the process goes to step S55 as it is.
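The scale-toggle logic of steps S44 through S54 can be sketched as follows. This is an illustrative reconstruction; the function name and return convention are assumptions, while the flag semantics follow the description (f3 = 1 means the “detail” button is formed, i.e., the wide-area image is currently shown).

```python
# Hypothetical sketch of steps S44-S54: flag f3 selects which toggle button is
# formed, and touching it swaps the right-half real image between the scales.
def toggle_scale(f3, touched):
    """Return (new_f3, right_half) after one pass through steps S44-S54.
    right_half is None when nothing changes (the steps are skipped)."""
    if not touched:
        return f3, None
    if f3 == 1:
        # "detail" button was formed: user asked for the detailed display
        # (S45-S49); the "wide area" button replaces it.
        return 0, "real image (detail, 1.4 m/pixel)"
    # "wide area" button was formed: user asked for the wide-area display
    # (S50-S54); the "detail" button replaces it.
    return 1, "real image (wide area, 2.8 m/pixel)"
```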
  • In step S55, it is decided whether or not the user pushes down the “two screen” button switch 8 b 1 provided on the remote controller 8. If it is concluded that the user pushes down the “two screen” button switch 8 b 1, it is regarded that the user requests cancellation of the two-screen display. Then, it is decided whether or not the flag f3 is 1 (step S56).
  • If it is concluded that the flag f3 is 1 (i.e., the “detail” button switch is formed), the “detail” button is removed (step S57). The flag f3 is set to 0 (step S58). The process goes to step S2 (FIG. 2). In contrast, if it is concluded that the flag f3 is not 1 (i.e., the “wide area” button switch is formed), the “wide area” button is removed (step S59). Then, the process goes to step S2.
  • Also, if it is concluded in step S41 that the flag f1 indicating the state of the screen displayed on the display panel 9 b is not 2 (i.e., the flag f1 is not any one of 0 to 2 but is 3 and, as shown in FIG. 9, the map screen is displayed in the left half area of the display panel 9 b and the map screen is displayed in the remaining right half area on another scale), the process goes to step S61 (FIG. 4).
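The branching on the flag f1 above can be sketched as a small dispatch table. The meanings of values 0 and 1 are inferred from the surrounding description and are assumptions, not statements from the patent; values 2 (map plus real image) and 3 (map plus map) are stated in the text. All names are illustrative.

```python
# Illustrative sketch of the screen-state dispatch implied by the flag
# f1 in step S41. Values 0 and 1 are assumptions inferred from the
# surrounding description; values 2 and 3 are stated in the text.

SCREEN_STATES = {
    0: "map screen on overall screen",           # FIG. 6 (assumed)
    1: "real image on overall screen",           # FIG. 7 (assumed)
    2: "map left half / real image right half",  # FIG. 8
    3: "map left half / map right half",         # FIG. 9
}

def dispatch(f1):
    """Route the main loop according to the current screen state."""
    if f1 == 2:
        return "steps S42-S59"   # two-screen map + real image handling
    if f1 == 3:
        return "steps S61-S78"   # two-screen map + map handling (FIG. 4)
    return "step S2"             # single-screen handling (FIG. 2)
```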
  • In step S61, the current location of own vehicle is calculated based on the GPS signal and the like. Then, the map screens indicating the neighborhood of the current location of own vehicle are displayed on a predetermined scale in the left and right half areas of the display panel 9 b, respectively, based on the calculated current location information of own vehicle and the map data stored in the RAM 1 a (step S62). That is, the map screens displayed in the left and right half areas are respectively updated in response to the movement of own vehicle.
  • Then, it is decided whether or not the flag f3, which indicates that the “detail” button switch is formed, is 1 (step S63). If it is concluded that the flag f3 is 1 (i.e., the “detail” button switch is formed), it is decided whether or not the user touches the “detail” button switch (step S64). When it is concluded that the user touches the “detail” button switch, it is regarded that the user requests the map screen as the detailed display. Thus, the map screen showing the neighborhood of the current location of own vehicle (as the detailed display) is displayed in the right half area of the display panel 9 b based on the current location information of own vehicle and the map data stored in the RAM 1 a (step S65). Then, the “detail” button switch is removed (step S66). The “wide area” button switch is formed in place of the “detail” button switch (step S67). The flag f3 is set to 0 (step S68). The process goes to step S74. In contrast, if it is concluded in step S64 that the user does not touch the “detail” button switch, steps S65 to S68 are skipped and the process goes to step S74 as it is.
  • Also, if it is concluded in step S63 that the flag f3 is not 1 (i.e., the “wide area” button switch is formed), it is decided whether or not the user touches the “wide area” button switch (step S69). If it is concluded that the user touches the “wide area” button switch, it is regarded that the user requests the map screen as the wide area display. Thus, the map screen showing the neighborhood of the current location of own vehicle (as the wide area display) is displayed in the right half area of the display panel 9 b based on the current location information of own vehicle and the map data stored in the RAM 1 a (step S70).
  • Next, the “wide area” button switch is removed (step S71). The “detail” button switch is formed in its place (step S72). The flag f3 is set to 1 (step S73). The process goes to step S74. In contrast, if it is concluded in step S69 that the user does not touch the “wide area” button switch, steps S70 to S73 are skipped and the process goes to step S74 as it is.
  • In step S74, it is decided whether or not the user pushes down the “two screen” button switch 8 b 1 provided on the remote controller 8. When it is concluded that the user pushes down the “two screen” button switch 8 b 1, it is regarded that the user requests cancellation of the two-screen display. Then, it is decided whether or not the flag f3 is 1 (step S75).
  • If it is concluded that the flag f3 is 1 (i.e., the “detail” button switch is formed), the “detail” button switch is removed (step S76). The flag f3 is set to 0 (step S77). The process goes to step S2 (FIG. 2). In contrast, if it is concluded that the flag f3 is not 1 (i.e., the “wide area” button switch is formed), the “wide area” button switch is removed (step S78). The process goes to step S2.
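The flag-driven button handling walked through in steps S44 to S59 (and mirrored for the map/map mode in steps S63 to S78) can be sketched as a small state machine. This is a minimal sketch; the class and method names are illustrative assumptions, since the patent describes flowchart steps rather than an API.

```python
# Minimal sketch of the flag-driven "detail"/"wide area" toggle in
# steps S44 to S59. All names are illustrative assumptions; the patent
# describes flowchart steps, not this API.

class TwoScreenController:
    """Tracks which zoom button is formed for the right half area.

    f3 == 1: the "detail" button is formed (wide-area view shown).
    f3 == 0: the "wide area" button is formed (detailed view shown).
    """

    def __init__(self):
        # Entering two-screen mode shows the wide-area view and forms
        # the "detail" button (steps S19 and S21), so f3 starts at 1.
        self.f3 = 1
        self.right_view = "wide area"

    def on_touch(self, button):
        # Touching the currently formed button redraws the right half
        # on the other scale and swaps the button (steps S44 to S54).
        if self.f3 == 1 and button == "detail":
            self.right_view = "detail"     # step S46
            self.f3 = 0                    # steps S47 to S49
        elif self.f3 == 0 and button == "wide area":
            self.right_view = "wide area"  # step S51
            self.f3 = 1                    # steps S52 to S54
        # any other touch skips these steps

    def cancel_two_screen(self):
        # "Two screen" button pushed: remove whichever zoom button is
        # formed and clear the flag (steps S55 to S59).
        self.f3 = 0
        self.right_view = None
```

Because at most one of the two buttons is formed at a time, a single flag suffices to know both which button to test for a touch and which button to remove on cancellation.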
  • According to the navigation apparatus of the embodiment (1), the map screen can be displayed in the left half area on the display panel 9 b, while the real image can be displayed in the right half area. Therefore, not only the map screen but also the real image such as the satellite photograph can be displayed on the display panel 9 b. As a result, sufficient information can be offered to the user.
  • Also, when the display is switched from the state in which the map screen is displayed on the overall screen of the display panel 9 b (FIG. 6) to the state in which the map screen is displayed in the left half area of the display panel 9 b and the real image is displayed in the remaining right half area, the real image as the wide area display is displayed in the remaining right half area (FIG. 8). Therefore, even if the real image is displayed in a narrowed area such as the right half area of the display panel 9 b, a lot of information can be offered to the user.
  • Also, if the display is switched from the state in which the map screen is displayed on the overall screen of the display panel 9 b (FIG. 6) to the state in which the real image is displayed on the overall screen of the display panel 9 b, the real image as the detailed display is displayed on the overall area (FIG. 7). That is, the scale of the real image is different from that in the case where the real image is displayed in the right half area of the display panel 9 b. Therefore, more significant information can be offered to the user.
  • In the above described embodiment, the real image having a large scale and the real image having a small scale are switched and used for display in response to the selected button (“two screen”, “satellite photograph”, “detail”, and “wide area”). However, in another embodiment, the microcomputer 1 may read only the real image data on one scale from the DVD-ROM 7 and store it in the RAM 1 a. When it is required to change the scale of the displayed real image data, the microcomputer 1 may conduct enlargement/reduction processes on the stored real image data in response to the selected button and display the processed real image. In this other embodiment, the amount of the real image data stored in the RAM 1 a is reduced to half in comparison with the embodiment (1). Therefore, the RAM 1 a is used more effectively.
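The single-scale alternative described above can be sketched with a simple integer-factor resampling. Nearest-neighbour resampling is an assumption for illustration; the patent does not specify the enlargement/reduction method, and a factor of two relative to the 2.8 m-per-pixel wide-area scale is likewise assumed.

```python
# Sketch of the alternative embodiment: only one scale of real image
# data is kept in RAM, and the other scale is derived on demand.
# Nearest-neighbour resampling and the integer factor are assumptions;
# the patent does not specify the resampling method.

def reduce_image(image, factor):
    """Derive a wide-area view by keeping every factor-th pixel."""
    if factor < 1:
        raise ValueError("factor must be a positive integer")
    return [row[::factor] for row in image[::factor]]

def enlarge_image(image, factor):
    """Derive a detailed view by replicating each pixel factor times."""
    if factor < 1:
        raise ValueError("factor must be a positive integer")
    out = []
    for row in image:
        wide_row = [px for px in row for _ in range(factor)]
        out.extend(list(wide_row) for _ in range(factor))
    return out
```

Holding only one scale halves the real image data kept in the RAM 1 a, at the cost of performing the resampling whenever the user switches scales.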
  • FIG. 1:
    • 1 microcomputer
    • 2 speed sensor
    • 3 gyro sensor
    • 4 GPS receiver
    • 5 DVD drive
  • FIG. 2:
    • S2 calculate current location information of own vehicle
    • S3 display map screen (update)
    • S4 Is real image data present?
    • S5 form “satellite photograph” button
    • S7 “Satellite photograph” ON?
    • S8 remove “satellite photograph” button
    • S10 display real image
    • S11 form “map” button
    • S13 “Two screen” ON?
    • S15 remove “satellite photograph” button
    • S17 Is real image data present?
    • S18 display map screen in left half
    • S19 display real image (wide area) in right half
    • S21 form “detail” button
    • S23 display map screen in left half
    • S24 display map screen (wide area) in right half
    • S26 form “detail” button
    • S29 remove “satellite photograph” button
    • S32 “Map” ON?
  • FIG. 3:
    • S42 calculate current location information of own vehicle
    • S43 update map screen in left half
    • S45 “Detail” ON?
    • S46 display real image (detail) in right half
    • S47 remove “detail” button
    • S48 form “wide area” button
    • S50 “Wide area” ON?
    • S51 display real image (wide area) in right half
    • S52 remove “wide area” button
    • S53 form “detail” button
    • S55 “Two screen” ON?
    • S57 remove “detail” button
    • S59 remove “wide area” button
  • FIG. 4:
    • S61 calculate current location information of own vehicle
    • S62 update map screen in right and left halves, respectively
    • S64 “Detail” ON?
    • S65 display map screen (detail) in right half
    • S66 remove “detail” button
    • S67 form “wide area” button
    • S69 “Wide area” ON?
    • S70 display map screen (wide area) in right half
    • S71 remove “wide area” button
    • S72 form “detail” button
    • S74 “Two screen” ON?
    • S76 remove “detail” button
    • S78 remove “wide area” button

Claims (8)

1. A navigation apparatus, comprising:
a display unit on which first and second areas are defined; and
a display control unit which includes a first display control section and controls the display unit to display at least one of a map screen and a real image, based on map data, real image data, and a present location, wherein:
when a first instruction signal is input to the display control unit, the first display control section controls the display unit to display the map screen in the first area and the real image on one of first to n-th scales in the second area, where n is an integer not smaller than two.
2. The navigation apparatus according to claim 1, wherein the first area and the second area do not overlap each other.
3. The navigation apparatus according to claim 1, wherein the one of the first to n-th scales is smaller than the other scales.
4. The navigation apparatus according to claim 1, wherein when a second instruction signal is input to the display control unit, the first display control section changes the scale of the real image displayed in the second area.
5. The navigation apparatus according to claim 1, wherein:
a third area is defined on the display unit;
the display control unit further includes a second display control section;
when a third instruction signal is input to the display control unit, the second display control section controls the display unit to display one of the map screen and the real image in the third area;
when the display unit displays the one in the third area and a fourth instruction signal is input to the display control unit, the second display control section changes the one of the map screen and the real image to the other.
6. The navigation apparatus according to claim 5, wherein the third area contains the first and second areas.
7. The navigation apparatus according to claim 5, wherein when the third instruction signal is input to the display control unit, the second display control section controls the display unit to display the real image on another of the first to n-th scales, which is different from the one of the first to n-th scales, in the third area.
8. The navigation apparatus according to claim 5, wherein when the display unit displays the real image in the third area and a fifth instruction signal is input to the display control unit, the second display control section changes the scale of the displayed real image.
US10/434,348 2002-05-13 2003-05-09 Navigation apparatus Active US6868336B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2002-137430 2002-05-13
JP2002137430A JP2003329467A (en) 2002-05-13 2002-05-13 Navigation device

Publications (2)

Publication Number Publication Date
US20050049782A1 true US20050049782A1 (en) 2005-03-03
US6868336B1 US6868336B1 (en) 2005-03-15

Family

ID=29699189

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/434,348 Active US6868336B1 (en) 2002-05-13 2003-05-09 Navigation apparatus

Country Status (4)

Country Link
US (1) US6868336B1 (en)
JP (1) JP2003329467A (en)
KR (1) KR100486644B1 (en)
CN (1) CN100368775C (en)


Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7376510B1 (en) * 2004-11-05 2008-05-20 Navteq North America, Llc Map display for a navigation system
JP4642497B2 (en) * 2005-02-10 2011-03-02 クラリオン株式会社 Navigation device
DE102005018082A1 (en) * 2005-04-19 2006-10-26 Robert Bosch Gmbh A method for three-dimensional representation of a digital road map,
JP2007127993A (en) 2005-11-07 2007-05-24 Matsushita Electric Ind Co Ltd Display apparatus and navigation apparatus
TWI317008B (en) * 2007-02-02 2009-11-11 Mitac Int Corp
US20090281727A1 (en) * 2008-05-06 2009-11-12 Matsushita Electric Industrial Co., Ltd. Car navigation system
US8427508B2 (en) 2009-06-25 2013-04-23 Nokia Corporation Method and apparatus for an augmented reality user interface
JP5384545B2 (en) * 2011-03-07 2014-01-08 本田技研工業株式会社 Navigation system, the navigation server, the navigation client and the navigation method
US9620033B2 (en) * 2012-03-23 2017-04-11 Nokia Technologies Oy Map
CN103839508A (en) * 2012-11-27 2014-06-04 联想(北京)有限公司 Electronic equipment, image display method and system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5884217A (en) * 1994-11-14 1999-03-16 Xanavi Informatics Corporation Map display apparatus for motor vehicle
US6202026B1 (en) * 1997-08-07 2001-03-13 Aisin Aw Co., Ltd. Map display device and a recording medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05113343A (en) 1991-10-22 1993-05-07 Pioneer Electron Corp Navigation system


Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080133132A1 (en) * 2004-08-10 2008-06-05 Thomas Jung Method For Displaying Map Information
US20060142943A1 (en) * 2004-12-27 2006-06-29 Yong Sun Park Navigation service method and terminal of enabling the method
US9470542B2 (en) 2006-09-28 2016-10-18 Siemens Aktiengesellschaft Method for the supply of encoded geoimage data
US20090276152A1 (en) * 2006-09-28 2009-11-05 Klaus Lukas Method for the supply of encoded geoimage data
WO2008037664A1 (en) * 2006-09-28 2008-04-03 Siemens Aktiengesellschaft Method for preparing encoded geoimage data
US20080284795A1 (en) * 2006-12-08 2008-11-20 Andreas Ebert Method and device for controlling the display of information in two regions of a display area in a transportation device
US9927255B2 (en) * 2006-12-08 2018-03-27 Volkswagen Ag Method and device for controlling the display of information in two regions of a display area in a transportation device
US20080221790A1 (en) * 2007-03-06 2008-09-11 Samsung Electronics Co. Ltd. Method and terminal for providing a route in a navigation system using satellite image
US8478521B2 (en) * 2007-03-06 2013-07-02 Samsung Electronics Co., Ltd. Method and terminal for providing a route in a navigation system using satellite image
US8762052B2 (en) 2007-03-06 2014-06-24 Samsung Electronics Co., Ltd. Method and terminal for providing a route in a navigation system using satellite image
US7963656B2 (en) * 2007-04-03 2011-06-21 Denso Corporation Vehicular meter unit and display device
US20110090074A1 (en) * 2007-04-03 2011-04-21 Denso Corporation Vehicular meter unit and display device
US20150191119A1 (en) * 2012-07-20 2015-07-09 Toyota Jidosha Kabushiki Kaisha Vehicle periphery monitoring device and vehicle periphery monitoring system
US10046701B2 (en) * 2012-07-20 2018-08-14 Toyota Jidosha Kabushiki Kaisha Vehicle periphery monitoring device and vehicle periphery monitoring system

Also Published As

Publication number Publication date
US6868336B1 (en) 2005-03-15
CN1460835A (en) 2003-12-10
JP2003329467A (en) 2003-11-19
KR100486644B1 (en) 2005-05-03
CN100368775C (en) 2008-02-13
KR20030088341A (en) 2003-11-19

Similar Documents

Publication Publication Date Title
JP3375258B2 (en) Map display method and apparatus and a navigation apparatus including the device
EP0829839B1 (en) Land vehicle navigation system with multi-screen mode selectivity
US5243529A (en) Navigation apparatus
US5396430A (en) On-board navigation apparatus
US7412326B2 (en) Map display device and map display method
EP0508787B1 (en) On-board navigation apparatus
JP4676684B2 (en) Vehicle information terminal
JP4038529B1 (en) Navigation system, a portable terminal device and the peripheral image display method
EP0505152B1 (en) Map display device
US4862374A (en) Navigational plotting system
US20030078724A1 (en) Image display
US5089826A (en) Navigation system for movable body
EP1288628A2 (en) Navigation device, map displaying method and image display device
JP4486175B2 (en) Three-dimensional map display device and method
EP1061340B1 (en) Vehicle-mounted display system and display method
CN101101219B (en) Vehicle-mounted displaying device and displaying method employed for the same
JP3644473B2 (en) Map display device and a recording medium
JP4062804B2 (en) The navigation device and a route display device
US20030120423A1 (en) Motor vehicle navigation system with enhanced map selection
US7840347B2 (en) Navigation system and route setting method
US6999875B2 (en) Display method and apparatus for navigation system
KR930009455B1 (en) Touch panel device
JP4597496B2 (en) Display device
WO1996015516A1 (en) Map display device for vehicle
CN1254661C (en) Image display equipment

Legal Events

Date Code Title Description
AS Assignment


Owner name: FUJITSU TEN LIMITED, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NAKANO, MASAHIKO;REEL/FRAME:014062/0085

Effective date: 20030501

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

AS Assignment

Owner name: APPLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FUJITSU TEN LIMITED;REEL/FRAME:031307/0982

Effective date: 20130301

FPAY Fee payment

Year of fee payment: 12